this post was submitted on 10 Feb 2024
255 points (97.8% liked)

Technology

[–] jeena@jemmy.jeena.net 3 points 9 months ago* (last edited 9 months ago) (2 children)

Sure, spam is bad, but I can just ignore it. Last week, though, there was an attack with CSAM that showed up while I was casually browsing "new" — that made me not want to open Lemmy anymore.

I think that is what needs to be fixed before we can tackle spam.

[–] misk@sopuli.xyz 2 points 9 months ago

Whatever is done to fight spam should be useful in fighting CSAM too. The latest "AI" boom could prove lucky for non-commercial social networks, since content recognition is exactly the kind of task machine learning handles well. Obviously it's a significant cost, so pitching in to cover running expenses will have to become more common.
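As a rough illustration of the simplest form of automated recognition: matching uploads against a shared database of known-bad image hashes. This is a minimal sketch with hypothetical names and a hypothetical blocklist; real deployments (e.g. PhotoDNA-style systems) use *perceptual* hashes so that re-encoded or cropped copies still match, not exact cryptographic digests like this:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad images,
# e.g. synced from an industry hash-sharing programme.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_bad(image_bytes: bytes, blocklist: set[str] = KNOWN_BAD_HASHES) -> bool:
    """Reject an upload if its digest matches a known-bad entry.

    Exact hashing only catches byte-identical copies; production systems
    use perceptual hashing to survive re-encoding, resizing, and crops.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in blocklist
```

Even this naive version is cheap to run at upload time; the ML-based classifiers discussed above are what drive up the compute cost.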

Admins are actively looking into solutions. Nobody wants that stuff stored on their server, and there are a bunch of legal obligations you must follow when it happens.

One of the problems is the compute cost of running CSAM-detection models on pictures before they are stored, which makes it unviable for many instances. Lemmy.world is moving towards only allowing images hosted on whitelisted sites, I think.
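The whitelist approach sidesteps the compute cost entirely: the instance never stores third-party images, it only accepts links to trusted hosts. A minimal sketch of such a check, with hypothetical host names (not Lemmy.world's actual list):

```python
from urllib.parse import urlparse

# Hypothetical whitelist; an instance would configure its own trusted hosts.
ALLOWED_IMAGE_HOSTS = {"imgur.com", "catbox.moe"}

def image_url_allowed(url: str, allowed: set[str] = ALLOWED_IMAGE_HOSTS) -> bool:
    """Accept an image link only if it points at a whitelisted host."""
    host = (urlparse(url).hostname or "").lower()
    # Allow exact matches and subdomains of whitelisted hosts,
    # but not lookalikes such as "imgur.com.evil.example".
    return any(host == h or host.endswith("." + h) for h in allowed)
```

The subdomain check matters: comparing with `endswith("imgur.com")` alone would also accept `notimgur.com`, so the leading dot is deliberate.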