this post was submitted on 05 Mar 2024
543 points (97.2% liked)


A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the complexity of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub's UK site, and there are hopes that similar measures on other platforms will create a safer internet environment.
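
The article doesn't describe how the matching actually works, but conceptually an intervention like this checks each query against a list of flagged terms before running the search. A minimal sketch in Python, with the term list and messages entirely hypothetical:

```python
# Minimal sketch of a keyword-triggered search intervention.
# FLAGGED_TERMS is a hypothetical placeholder; the real matching
# rules used in the trial are not public.
FLAGGED_TERMS = {"placeholder_flagged_term"}

WARNING = ("This search may relate to illegal material. "
           "Confidential support services are available.")

def handle_search(query: str) -> str | None:
    """Return a warning instead of results when a query matches the list."""
    tokens = set(query.lower().split())
    if tokens & FLAGGED_TERMS:
        return WARNING  # show the warning and chatbot instead of results
    return None  # no match: proceed with the normal search

# Example: a matching query triggers the intervention.
print(handle_search("placeholder_flagged_term video"))
```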

[–] Mostly_Gristle@lemmy.world 69 points 8 months ago (7 children)

The headline is slightly misleading. 2.8 million searches were halted, but according to the article they didn't attempt to figure out how many of those searches came from the same users. So thankfully the number of secret pedophiles in the UK is probably much lower than the headline might suggest.
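
In other words, the 2.8 million figure counts search events, not people. A toy illustration of how the two metrics diverge, using entirely invented data:

```python
# Invented events: each tuple is (hashed_user_id, halted_query).
events = [
    ("user_a", "flagged query"),
    ("user_a", "flagged query"),
    ("user_a", "another flagged query"),
    ("user_b", "flagged query"),
]

total_searches = len(events)                      # 4 halted searches...
unique_users = len({user for user, _ in events})  # ...from only 2 users

print(f"{total_searches} searches from {unique_users} users")
```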

[–] preasket@lemy.lol 75 points 8 months ago (2 children)

I suspect a lot of CSAM searches come from underage users themselves.

[–] Dran_Arcana@lemmy.world 32 points 8 months ago (3 children)

I'd think it's probably not a majority, but I do wonder what percentage it actually is. I do have distinct memories of being like 12 and trying to find porn of people my own age instead of "gross old people" and being confused why I couldn't find anything. Kids are stupid lol, that's why laws protecting them need to exist.

Also good god when I become a parent I am going to do proper network monitoring; in hindsight I should not have been left unattended on the internet at 12.
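
For what "proper network monitoring" could look like in practice, one common approach is logging DNS queries at the router and flagging lookups against a blocklist. A rough sketch, assuming a dnsmasq/Pi-hole-style log format; the domains and log line here are illustrative:

```python
# Hypothetical sketch: flag DNS lookups of listed domains from a
# dnsmasq-style query log, e.g.
#   "Jan 01 12:00:00 dnsmasq[123]: query[A] example.com from 192.168.1.50"
import re

BLOCKLIST = {"example-flagged-site.test"}  # placeholder domains

QUERY_RE = re.compile(r"query\[\w+\] (?P<domain>\S+) from (?P<client>\S+)")

def flag_queries(log_lines):
    """Yield (client, domain) for each DNS lookup that hits the blocklist."""
    for line in log_lines:
        m = QUERY_RE.search(line)
        if m and m.group("domain") in BLOCKLIST:
            yield m.group("client"), m.group("domain")

sample = ["Jan 01 12:00:00 dnsmasq[123]: "
          "query[A] example-flagged-site.test from 192.168.1.50"]
for client, domain in flag_queries(sample):
    print(f"{client} looked up {domain}")
```

A DNS log only shows domain names, not full URLs or search terms, so it's less invasive than full traffic inspection; that trade-off is relevant to the privacy concern raised in the reply below.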

[–] kylian0087@lemmy.world 14 points 8 months ago

I was the same back then, and I came across some stuff that was surprisingly easy to find. Only later did I realize how messed up that was.

I think monitoring is good, but there's a fine line you shouldn't cross with your child's privacy. If they suspect anything, they'll figure out how to work around it, and then you lose any insight.

[–] Rinox@feddit.it 6 points 8 months ago* (last edited 8 months ago)

It's not about laws; it's about sex education. Sex education is a topic that can't be left to the parents and should be taught in school, so kids get a complete knowledge base.

Most parents know about as much about sex as they do about medicine. They've had some, but that doesn't give them a degree to teach the subject.

[–] Piece_Maker@feddit.uk 2 points 8 months ago

Sorry, I know this is a serious subject and not a laughing matter, but that's a funny situation. I guess I was a MILF hunter at that age, because even then I was perfectly happy to knock one out watching adult porn instead!
