this post was submitted on 29 Oct 2025
742 points (99.6% liked)

Not The Onion


Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Please also avoid duplicates.

Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!

[–] phoenixz@lemmy.ca 24 points 1 week ago* (last edited 1 week ago) (4 children)

One wonders how much child porn was in there....

But it's AI, so it's aaaaalllll fine

[–] Evotech@lemmy.world 8 points 1 week ago

It's kinda weird how specific you have to be with certain models to keep them from generating very, very young-looking people

[–] vane@lemmy.world 21 points 1 week ago

That's a mafia-level response.

[–] sundray@lemmus.org 14 points 1 week ago

So their AI is so technologically weak that they can’t even blame it for all the porn downloads, eh. SMH.

[–] 843563115848@lemmy.zip 11 points 1 week ago

Liars. So many liars these days...

[–] BilSabab@lemmy.world 11 points 1 week ago (6 children)

Ok, but why would anyone bother training AI on porn? Seriously, I don't understand

[–] calcopiritus@lemmy.world 21 points 1 week ago (2 children)

Gen AI porn and shitposts are the only two decent use cases I've seen for gen AI.

You can't make half of those without training it on porn.

[–] Sabata11792@ani.social 2 points 1 week ago

At this point, I'm betting you don't hear news about image models because the cutting edge is in anime titties and celeb nudes.

[–] moondoggie@lemmy.world 19 points 1 week ago (1 children)

An article I read when this came up a couple of months ago said that porn was basically the best way to show AI unclothed human movement. Watching clothed humans move, you can get the basics but can't see how the muscles are working. If you show it a Hollywood movie, it might see occasional shirtless scenes or artfully lit and blocked-out sex scenes. Porn has the greatest amount of naked people moving their bodies.

[–] mangaskahn@lemmy.world 10 points 1 week ago (1 children)

Video generation, copyright matching, CSAM detection: those are just the first few that pop into my head.

[–] BilSabab@lemmy.world 2 points 1 week ago

got it. thx

[–] falseWhite@lemmy.world 8 points 1 week ago (2 children)

Have you not heard that everyone is now doing NSFW chatbots? Meta is just trying to catch up with Grok and ChatGPT.

[–] BilSabab@lemmy.world -1 points 1 week ago (5 children)

I've heard about that. I just don't really understand why anyone would waste their resources on it. AI training is too expensive to waste on that.

[–] Mwa@thelemmy.club 2 points 1 week ago (1 children)

probably for the AI to learn how to detect NSFW content???

[–] GaryGhost@lemmy.world 5 points 1 week ago* (last edited 1 week ago)

https://youtu.be/a3PmewuSb1w

Mostly for personal use, m-porno

[–] BonesOfTheMoon@lemmy.world -4 points 1 week ago (4 children)

As an aside, a great deal of CSAM is shared through Facebook. CSAM survivors have asked them to stop this, and they said no. The survivor advocacy group Phoenix 11 submitted six formal questions to old Zuckface fuckface about it in the US Congress, since he deployed end-to-end encryption, which makes this possible, and he dodged them like the lying fuck he is. Zuck would sell it himself if it made him a whole dollar, and nobody should forget that.
