this post was submitted on 11 Dec 2025
570 points (96.4% liked)

Technology
[–] Kyrgizion@lemmy.world 21 points 2 days ago (5 children)

This was about sending a message: "stfu or suffer the consequences." People who encounter something similar will think twice about reporting anything.

[–] Devial@discuss.online 30 points 2 days ago* (last edited 2 days ago) (4 children)

Did you even read the article? The dude reported it anonymously, to a child protection org, not Google, and his account was nuked as soon as he unzipped the data, because the content was automatically flagged.

Google didn't even know he reported this, and Google has nothing whatsoever to do with this dataset. They didn't create it, and they don't own or host it.

[–] Whostosay@sh.itjust.works 1 point 2 days ago (3 children)

It seems they did react to it, though.

[–] Devial@discuss.online 12 points 2 days ago* (last edited 2 days ago) (1 children)

They didn't react to anything. The automated system (correctly) flagged and banned the account for CSAM, and, as usual, the manual ban appeal sucked ass and didn't do what it's supposed to do. (Whilst this is obviously a very unusual case and the ban should have been overturned on appeal right away, it does make sense that the appeals team, broadly speaking, rejects "I didn't know this contained CSAM" as a legitimate appeal reason.) This is barely newsworthy. The real headline should be about how hundreds of CSAM images were freely available and shareable in this dataset.

[–] Whostosay@sh.itjust.works 3 points 2 days ago (1 children)

An automatic reaction is a reaction.

[–] Devial@discuss.online 10 points 2 days ago* (last edited 2 days ago)

They reacted to the presence of CSAM. It had nothing whatsoever to do with it being contained in an AI training dataset, as the comment I originally replied to states.
