this post was submitted on 21 Nov 2024
153 points (97.5% liked)

Today, a prominent child safety organization, Thorn, in partnership with a leading cloud-based AI solutions provider, Hive, announced the release of an AI model designed to flag unknown CSAM at upload. It's one of the first AI technologies aiming to flag unreported CSAM at scale.

you are viewing a single comment's thread
[–] sexual_tomato@lemmy.dbzer0.com 6 points 1 day ago* (last edited 1 day ago) (1 children)

A generative model uses the classifier as part of its training. If you start from a picture of pure random noise and iteratively keep the random perturbations that the classifier says "look" more like CSAM, you can effectively generate images the classifier is 100% certain are CSAM. Whether the result looks anything like what a human would consider CSAM depends on other factors, but it remains a possibility.
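A minimal sketch of the loop being described, assuming a generic image classifier that returns per-class scores; the classifier, target class index, step count, and step size are all hypothetical stand-ins:

```python
import torch

@torch.no_grad()
def hill_climb(classifier, target_class, steps=10_000, step_size=0.05):
    """Random-search hill climbing against a classifier's confidence."""
    # Start from pure random noise.
    image = torch.rand(1, 3, 224, 224)
    best = classifier(image).softmax(-1)[0, target_class]
    for _ in range(steps):
        # Propose a small random perturbation of the current image...
        candidate = (image + step_size * torch.randn_like(image)).clamp(0, 1)
        score = classifier(candidate).softmax(-1)[0, target_class]
        # ...and keep it only if the classifier thinks it "looks" more
        # like the target class.
        if score > best:
            image, best = candidate, score
    return image, best
```

Optimizing directly against a classifier this way typically yields adversarial noise the classifier is confident about, not images a human would recognize.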

[–] todd_bonzalez@lemm.ee 7 points 1 day ago

You are describing the way DeepDream works, not the way modern diffusion models work. It's the difference between psychedelic dog faces and a highly prompt-adherent generated image of a German Shepherd.
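For contrast, a stripped-down sketch of that DeepDream-style idea: gradient ascent on the input image itself to maximize a class score. (Real DeepDream maximizes intermediate-layer activations and adds jitter and smoothing, omitted here; the pretrained model and ImageNet class index are purely illustrative.)

```python
import torch
from torchvision.models import resnet18, ResNet18_Weights

model = resnet18(weights=ResNet18_Weights.DEFAULT).eval()
image = torch.rand(1, 3, 224, 224, requires_grad=True)

for _ in range(200):
    # 235 is the ImageNet class index for "German shepherd".
    model(image)[0, 235].backward()
    with torch.no_grad():
        image += 0.01 * image.grad.sign()  # ascend the class score on the input itself
        image.clamp_(0, 1)
    image.grad.zero_()
```

A diffusion model instead learns to iteratively denoise toward the training data distribution, which is why its outputs are coherent images rather than classifier-fooling noise.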

I can't imagine you're going to get anything out of this model that actually looks like CSAM, unless there's some sort of breakthrough in using these models for previously unrealized generative purposes.