
[–] crazyminner@lemmy.ml 24 points 6 months ago* (last edited 6 months ago) (4 children)

I had an idea when these AI image generators first started gaining traction: flood the CSAM market with AI-generated images (good enough that you can't tell them apart). In theory this would put the actual creators of CSAM out of business, thus saving a lot of children from the trauma.

Most people downvote the idea on gut reaction, though.

Looks like they might do it on their own.

[–] DarkThoughts@fedia.io 10 points 6 months ago (1 children)

It's such an emotional topic that people lose all rationality. I remember the Reddit arguments in the comment sections about pedophiles, where some would equate the term with actual child rapists, while others argued for differentiating, since the former haven't done anything wrong and shouldn't be stigmatized for what's going on in their heads but rather offered help to cope with it. The replies were typically accusations that those people were making excuses for actual sexual abusers.

I've always had the standpoint that I don't really care about people's fictional content. Be it lolis, torture, gore, or whatever other weird shit. If people are getting their kicks from fictional stuff, I see that as better than using actual real-life material, or even getting hands-on experience, all of which would involve actual real victims.

And I think that should generally be the goal here, no? Be it pedos, sadists, sociopaths, whatever. In the end it shouldn't be about them, but about saving potential victims. But people would rather throw around accusations and get hysterical to paint themselves as sitting on their moral high horse (ironically, typically while also calling for things like executions or castrations).

[–] Cupcake1972@mander.xyz 3 points 6 months ago

Yeah, exact same feeling here. If there is no victim, then who exactly is harmed?

[–] Itwasthegoat@lemmy.world 8 points 6 months ago (1 children)

My concern is: why would it put them out of business? If we just look at legal porn, there's already a huge amount of it, and the market is still there for new content to be created constantly. AI porn hasn't noticeably decreased the amount being produced.

Really, flooding the market with CSAM makes it easier to consume and may end up INCREASING the number of people trying to get CSAM. That could end up encouraging more to be produced.

[–] crazyminner@lemmy.ml 4 points 6 months ago

The market is slightly different, though. Most CSAM is images; with porn there's a lot of video as well as images.

[–] jaschen@lemm.ee 4 points 6 months ago

It's also a victimless crime. Just like flooding the market with fake rhino horns and dropping the market price to the point that it isn't worth it anymore.

[–] Glass0448@lemmy.today -4 points 6 months ago (1 children)

It would be illegal in the United States. Artistic depictions of CSAM are illegal under the PROTECT Act of 2003.

[–] TheGrandNagus@lemmy.world 4 points 6 months ago* (last edited 6 months ago)

And yet it's out there in droves on mainstream sites, completely without issue. Drawings and animations are pretty unpoliced.