A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite some limitations in the data and the complexity of measuring its effect, the chatbot showed promise in deterring illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub's UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.

theherk@lemmy.world 12 points 8 months ago

Until a few years ago, when they finally stopped allowing unmoderated, user-uploaded content, they had a ton of very problematic videos. And they were roasted about it in public for years, including by many of the unconsenting, sometimes underage subjects of those videos, and they did nothing. Good that they finally acted, but they had trained users for years that it was a place to find that content.