this post was submitted on 18 Mar 2026
317 points (96.8% liked)

[–] wrinkle2409@lemmy.cafe 84 points 1 day ago (25 children)

I don't understand this. They're dolls; they aren't alive. Why would people care? This may be controversial, but I'd rather have a pedophile fucking a doll than raping a child.

[–] ulterno@programming.dev 6 points 21 hours ago

They are making this legislation to steer people's focus away from the real CSA.

Remember: CSAM is just the symptom. CSA is the actual cause.

[–] Iconoclast@feddit.uk 17 points 1 day ago (3 children)

It's a moral panic, pure and simple. The same reason some countries want to ban cartoon/animated pictures where the fictional character looks too young. I guess the underlying assumption is that it'll increase the number of people offending against real children, but I don't think there's any evidence to back that up.

If it were up to me, the criterion would be whether an actual person is being hurt, directly or as a consequence. That would include real violence, real pictures, and possibly also GenAI stuff if it's trained on real content.

[–] ulterno@programming.dev -4 points 21 hours ago

A reduction in real pictures being distributed is not a real indicator of a reduction in CSA and CSE either.

A simple anecdote to show it: how many pictures of Epstein with children are in distribution? How many of his clients? Compare that to the actual lives he and his gang destroyed.

The small-timers are easier to catch and cull with traditional policing, and internet restrictions/surveillance will do nothing to them compared to what it will do to absolutely everyone else.

As far as the company in the post goes, better off letting them sell in your country, so you can easily put their customers on a watchlist, rather than have them stay unknown until they start harming real people.

[–] village604@adultswim.fan 75 points 1 day ago* (last edited 1 day ago) (3 children)

Exactly. Same with faux bait stuff. I personally think it's gross, so I don't consume it, but if everyone involved is a consenting adult and it stops people from consuming real CSAM, I can't really support banning it.

But the problem many people have with stuff like that is they assume the people consuming it will go on to do it to real people, which is the same argument they tried to use against violent video games.

[–] MagicShel@lemmy.zip 3 points 23 hours ago

A less obvious problem with AIGen CSAM is that the sheer volume of it could make it nearly impossible to track down actual cases of abused children. I am not particularly morally concerned with someone generating it — I don't think it directly harms any child and I'm not entirely convinced it harms the consumer. And if those were the only considerations, I'd say have at it (subject to further research because I don't think it is conclusive that it is harmless to the consumer, either).

But if it means law enforcement agencies have to give up prosecuting pedo rings of actual abusers because they can't tell which images among the thousands are real, well, that is real harm to real victims, and that is enough to ban it.

[–] idiomaddict@lemmy.world 15 points 1 day ago (1 children)

It wouldn’t compel me to hurt people, but I definitely get more into kinks the more time I spend with them (to a point). Violence in media has never had a noticeable effect on me though.

[–] nymnympseudonym@piefed.social 24 points 1 day ago* (last edited 1 day ago) (1 children)

Society would probably actually benefit from a non-political, purely objective, science-based commission to review published data, make recommendations for new studies, and come up with an evidence-based recommendation to governments about whether virtual CSAM (no actual children harmed or in AI training data) and lifelike child sex dolls result in statistically more child predation.

I haven't done a deep dive on this, so maybe it's already well known among sociologists/psychopathologists. But the key is a trusted, science-based policy. We did it for violent video games and found no correlation. It's not at all obvious to me whether that also holds for pederasty.

Yeah, I know, the trusted scientific commission is not going to happen.

[–] Iconoclast@feddit.uk 22 points 1 day ago (12 children)

whether virtual CSAM (no actual children harmed or in AI training data) and lifelike child sex dolls result in statistically more child predation.

It could, but I doubt that it would. Pedophiles don't rape children - rapists do. Being both is rare. Having been born with an attraction to children doesn't mean they automatically also lack a moral compass and self-control. Most of them know it's wrong and never offend. The vast majority of people in prison for child sexual abuse aren't pedophiles but just good old rapists. Kids simply make an easy target.

[–] ZILtoid1991@lemmy.world 1 points 22 hours ago

Most reformed pedophiles also got reformed before ever offending, so...

[–] ulterno@programming.dev 0 points 21 hours ago

Kids simply make an easy target.

This is the most relevant point I've seen on the current scene so far.

Also, boarding schools.

[–] pHr34kY@lemmy.world 2 points 1 day ago (1 children)

Generated CSAM is banned. For the same reason, a ban on something like this should follow.

[–] Iconoclast@feddit.uk 2 points 19 hours ago

The case for banning simulated CSAM produced with GenAI is that if the training data contains actual CSAM, then it directly contributes to real children being hurt. Obviously, generating those pictures doesn't cause further physical harm to anyone, but someone has to have already been harmed in the past for that training data to exist in the first place.

This, however, is not true of cartoons, for example, nor does it apply to sex dolls.

[–] zach@lemmy.dbzer0.com 14 points 1 day ago (3 children)

I believe the last time something like this came up, the argument was raised that it normalizes the behavior and leads to escalation: “they're just illustrations” and “it's just a doll” become “I'm just taking photos” or “it's just touching”, this time against actual victims.

[–] RaoulDuke85@piefed.social 27 points 1 day ago (1 children)

They said same-sex marriage would lead to bestiality.

[–] Soup@lemmy.world -2 points 22 hours ago

Fake kids to real kids is very different from some crazy fucko thinking same-sex marriage would lead to fucking animals. Are you for real?

[–] dustyData@lemmy.world 37 points 1 day ago* (last edited 1 day ago)

Slippery slope fallacy. We know that consumption of real CSAM might increase frustration and lead to the pursuit of real crimes. However, we don't have the same level of evidence for illustrations or sex dolls. It's a massive blind spot in the scientific literature. It's very hard to study.

Despite this, the number one risk factor still remains unsupervised access to minors. Regardless of whether the abuser consumes abuse media or not.

[–] Feyd@programming.dev 11 points 1 day ago (3 children)

Does the research support this argument though? (Spoiler: it doesn't)

[–] 5too@lemmy.world 4 points 20 hours ago* (last edited 20 hours ago) (1 children)

To my knowledge, there is very little research at all - the programs that would look into whether this might protect or endanger children struggle to get funded, because it's icky.

[–] frongt@lemmy.zip 3 points 19 hours ago

And anyone looking into it immediately gets labeled as defending abusers.

[–] AnotherUsername@lemmy.ml 1 points 20 hours ago (1 children)

In theory this is non-harmful. In practice it's part of a fantasy escalation ladder that leads to bad places. Your actions are led by your thoughts, and you are the thoughts you feed. In reality, it's a good thing not to feed thoughts of abusing children.

I'd note that I'd be similarly uncomfortable with people buying hyper-realistic dolls to practice amateur torture on, but I'm OK with people buying silicone dolls to practice tattoo art and wound stitching on. The difference is intent, which is a line I'm equally unhappy with the government drawing. Someone slicing up a slab of silicone shaped like a baby because they have a desperate desire to hurt babies, which they are actively feeding, is bad. Someone practicing stitching up silicone babies after injuries because they always wanted to be a doctor and never got the chance is healthier and fine. It's the "what are you feeding with this action?" problem of governance.

[–] Randomgal@lemmy.ca 20 points 20 hours ago

This is 'video games cause school shootings' logic. There are better arguments than this.

[–] ZILtoid1991@lemmy.world 1 points 22 hours ago

If it's lifelike, I can understand it, because that's where I also draw the line when it comes to drawings and the like.

[–] Kowowow@lemmy.ca 4 points 1 day ago

Hey, if nothing else, it gives you a decent idea of who to watch.
