This post was submitted on 05 Mar 2024

A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the complexity of the problem, the chatbot showed promise in deterring illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub's UK site, with hopes that similar measures across other platforms will create a safer internet environment.
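
The mechanism described is, at its core, a deny-list check in the search path: incoming queries are matched against a curated term list, and flagged searches return a warning plus the chatbot instead of results. Below is a minimal sketch of that flow in Python; the terms, warning text, and function names are all placeholders of mine, since the real list and implementation are (deliberately) not public.

```python
# Illustrative sketch of the search-interception flow described above.
# FLAGGED_TERMS stands in for the curated list drawn up with the child
# protection organizations; the real list and wording are not public.
FLAGGED_TERMS = {"placeholder-flagged-term", "another-placeholder-term"}

HELPLINE_WARNING = (
    "Searching for this material is illegal. "
    "Confidential support is available."  # a real deployment links a helpline
)

def run_search(query: str) -> list:
    return []  # stand-in for the real search backend

def handle_search(query: str) -> dict:
    # Normalize so trivial casing/spacing changes don't bypass the check.
    normalized = " ".join(query.lower().split())
    if any(term in normalized for term in FLAGGED_TERMS):
        # No results: show the warning and hand off to the chatbot instead.
        return {"results": [], "warning": HELPLINE_WARNING, "chatbot": True}
    return {"results": run_search(normalized), "warning": None, "chatbot": False}
```

A static list like this is exactly what invites the cat-and-mouse over wording raised in the top comment below; the trial's open question is whether the warning still deters once the terms shift.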

you are viewing a single comment's thread
[–] _cnt0@sh.itjust.works 43 points 8 months ago (4 children)

Non-paywall link: https://web.archive.org/web/20240305000347/https://www.wired.com/story/pornhub-chatbot-csam-help/

There's this lingering implication that there is CSAM on Pornhub. Why bother with "searches for CSAM" if they don't return CSAM results? And what exactly constitutes a "search for CSAM"? The article and the one it links to are incredibly opaque about that. Why target the consumer and not the source? This feels kind of backwards, like language policing that doesn't really address the problem. What do they expect to happen if they prohibit specific words/language? That people searching for CSAM will just give up? Do they expect anything beyond users changing the language they use and settling into a permanent cat-and-mouse game? I guess I share the sentiments that motivated them to do this, but it feels so incredibly pointless.

[–] TheBlackLounge@lemm.ee 23 points 8 months ago (3 children)

Lolicon is not illegal, and neither is giving your video a title that implies CSAM.

That raises the question: what about pedophiles who intentionally seek out simulated CP to avoid hurting children?

[–] SquiffSquiff@lemmy.world 21 points 8 months ago (1 children)

Simulated CP is legally considered the same as 'actual' CP in the UK

[–] CaptainEffort@sh.itjust.works 10 points 8 months ago (1 children)

Which is, imo, pretty dumb. If it gives these people an outlet that literally hurts no one, I say they should be allowed to use it. Without it they'll just go to more extreme lengths to get what they need, and in doing so may end up in places where actual, real-life children are being abused, or worse.

So while it’s still disgusting and I’d rather not think about it, if nobody’s being hurt then it’s none of my business. Let them get out their urges in a safe way that doesn’t affect anybody else.

[–] afraid_of_zombies@lemmy.world 9 points 8 months ago (1 children)

I imagine the concern is that it would look identical to the real thing, which blurs the lines. Kinda like how governments really hate it when toy makers make toy guns look too real, and why I have to tell airport security that I'd like my bag searched now, since there are homemade-looking electronic devices in it.

I guess in theory some government could make a certification system where legal simulated CP gets some digital watermark or something. But that would involve the government paying someone to review child porn for a living, and it'd be hard to sell that to taxpayers or to fill that role. Maybe the private sector would be willing to do it, but that is a big ask.
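
In practice, the certification idea floated here would look less like a literal watermark and more like a detached digital signature: the certifying body signs a hash of the file, and anyone can verify it against the body's published public key. Here is a toy sketch of that scheme, assuming the third-party `cryptography` package; the certifier and the whole arrangement are entirely hypothetical.

```python
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# Hypothetical certifying body's keypair. In the scheme floated above, the
# private key stays with the certifier and the public key is published.
certifier_key = Ed25519PrivateKey.generate()
certifier_public: Ed25519PublicKey = certifier_key.public_key()

def certify(file_bytes: bytes) -> bytes:
    """Certifier signs the content hash; the signature ships with the file."""
    return certifier_key.sign(hashlib.sha256(file_bytes).digest())

def is_certified(file_bytes: bytes, certificate: bytes,
                 public_key: Ed25519PublicKey) -> bool:
    """Anyone can check a file against the certifier's published public key."""
    try:
        public_key.verify(certificate, hashlib.sha256(file_bytes).digest())
        return True
    except InvalidSignature:
        return False
```

Note that a plain content-hash signature breaks the moment the file is re-encoded or cropped, which hints at why the "watermark" framing is harder than it sounds.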

I am not sure whether I agree with you or disagree with you. Maybe all of us would be better off if there were a legal and harmless way for pedos to get what they want. Or maybe it is bad to encourage it at all, even in a safe way, if consuming that stuff makes them more likely to seek out real children.

It definitely isn't a great situation; it'd be great if the condition could be cured some day.

[–] YarHarSuperstar@lemmy.world 4 points 8 months ago (1 children)

This covered a lot of my concerns and thoughts on the topic. I want these people to be able to seek help, and possibly even have a legal outlet that harms no one, i.e. not even someone who has to view that shit for a living, so maybe we get AI to do it? IDK. It's complicated, but I believe it's similar to an addiction in some ways and should be treated as a health issue, assuming they haven't hurt anyone and want help. This is coming from someone with health issues, including addiction, who is also very empathetic and sympathetic to any and all struggles of folks who are just trying to live better.

[–] afraid_of_zombies@lemmy.world 2 points 8 months ago

I can't even imagine the amount of money it would cost for someone to pay me to watch and critique child porn for a living. I have literally been paid in my life to fish a dead squirrel that was making the whole place stink out from underneath a trailer in July, and I would pick doing that professionally over watching that filth.

[–] Clbull@lemmy.world 5 points 8 months ago

Depends on the jurisdiction. Indecent illustrations and 'pseudo-photographs' depicting minors are definitely illegal in the UK (Coroners and Justice Act 2009). Several US states are also updating their laws to clamp down on this too.

I'm also aware that it's illegal in Switzerland because a certain infamous rule 34 artist fled his home country to evade justice for that very reason.

[–] archomrade@midwest.social 3 points 8 months ago (2 children)

I imagine high exposure (for individuals who are otherwise not explicitly searching for such material) could inadvertently normalize that behavior IRL.

[–] CaptainEffort@sh.itjust.works 9 points 8 months ago* (last edited 8 months ago) (3 children)

Like how video games supposedly normalize violence? Are you going to go shoot a bunch of people because GTA exists?

Ffs guys what year is this? Thought we were past this silly mindset.

load more comments (3 replies)
[–] _cnt0@sh.itjust.works 6 points 8 months ago (11 children)

Like exposure to gay people and gay content makes you gay? (/s if it wasn't obvious)

[–] squid_slime@lemmy.world 5 points 8 months ago* (last edited 8 months ago) (2 children)

No, very different. If someone hasn't come out, then having gay media around will normalize being gay, and I'd assume they could come out with less stigma. But this is a painfully ignorant and insulting comparison.

[–] _cnt0@sh.itjust.works 3 points 8 months ago (9 children)

but this is a painfully ignorant and insulting comparison

Only if you condemn the disposition and not its unacceptable execution. From where I stand, being attracted to children is as acceptable as men being attracted to men. Abusing children is as unacceptable as men raping men. If it is, in your book, fine to condemn pedophiles for being pedophiles, then Christian fundamentalists are totally fine hating homosexuals for being homosexual. Don't get me wrong, I'm neither condoning nor encouraging the (sexual) abuse of children. Unlike you, I'm just not a hypocrite about different sexual orientations/preferences that nobody chooses. The only qualitative difference is that in one case one side cannot consent and needs better protection from society. The only point I am (consistently) trying to make here is that I find it highly dubious that the measures described in the article have any impact on said required protection, and that the article completely fails to provide any shred of evidence, or even an indication, that they do.

[–] archomrade@midwest.social 3 points 8 months ago* (last edited 8 months ago) (2 children)

TW: discussions about sexual abuse

spoiler

If it is, in your book, fine to condemn pedophiles for being pedophiles, then Christian fundamentalists are totally fine hating homosexuals for being homosexual.

Fetishizing an abusive sexual behavior is not the same as same-sex attraction. We would be having the same conversation if we were talking about rape porn between adults: it's the normalization of the abusive behavior that we're primarily concerned with, not the ethics of watching simulated abuse in general.

While I don't believe that banning simulated material would be helpful, it is completely reasonable to suggest that cautioning individuals about the proximity of their search to material that is illegal - and the risks associated with consuming it - would be preventative against future consumption.

Especially considering Pornhub is only placing cautions around that material and isn't removing that content generally. It's hard to read your objections as anything other than pedophilia apologia.

[–] _cnt0@sh.itjust.works 5 points 8 months ago (8 children)

Being attracted to an abusive sexual behavior is not the same as being attracted to a consenting behavior between adults.

And I did not even hint at anything even close to the contrary.

We would be having the same conversation if we were talking about rape porn between adults: [...]

Which is exactly the comparison I made.

[...] it's the normalization of the abusive behavior that we're primarily concerned with, not the ethics of watching simulated abuse in general.

I wasn't talking about the normalization of anything anywhere. You inject a component that wasn't the subject of our conversation before, to defend a point I wasn't questioning (a red herring).

While I don't believe that banning simulated material would be helpful, [...]

Another topic which we could discuss, but which - again - you just injected.

[...]it is completely reasonable to suggest that cautioning individuals about the proximity of their search to material that is illegal - and the risks associated with consuming it - would be preventative against future consumption.

And again: I'm asking for qualitative and quantitative proof of that. It is the one and only thing I was and am questioning about the article.

Especially considering Pornhub is only placing cautions around that material and isn't removing that content generally.

The point to our discussion being what?

It's hard to read your objections as anything other than pedophilia apologia.

You seem to have major trouble with text comprehension and staying on track with discussions.

load more comments (8 replies)
[–] Gabu@lemmy.world 3 points 8 months ago (1 children)

Minor complaint: try to get an empty paragraph between the spoilered text and the non-spoilered text whenever possible - it makes it easier to read.

Regarding the discussion, you're both right at the end of the day. Limiting exposure to illegal and immoral-adjacent material is obviously in society's interest, but at the same time the implication that a glorified ad for a mental illness helpline is a good solution is ludicrous - it's at the absolute bottom of the barrel when it comes to the kinds of issues we should be working on.

load more comments (1 reply)
load more comments (8 replies)
[–] Schadrach@lemmy.sdf.org 2 points 8 months ago (1 children)

How so? If CP and adjacent content (drawn stuff, "teen" porn, Catholic schoolgirl outfits, etc.) is going to ~~make people~~ promote and encourage people to molest children, why wouldn't gay porn promote and encourage homosexuality?

Like this is one of those things that feels a lot like picking and choosing based on preference. I suspect the only reason violence in media isn't on the bad list alongside sexy women and loli stuff is that it's historically been a right-wing talking point.

[–] squid_slime@lemmy.world 1 points 8 months ago* (last edited 8 months ago)

This is an entirely different discussion. My point and issue is with the comparison being in poor taste; like I said previously, I'd be equally annoyed if someone made a comparison between heterosexuality and bestiality, where one is normal and the other is morally wrong.

Edit: my mistake, I thought you had replied to a different comment.

We are products of our environment. I do believe that we are affected by the things around us; I'd imagine we'd have a lot more pedophiles if CP were on TV. Look at any industry built on abuse: people don't go in thinking they'll be the bad guy and fuck up someone's day, they themselves are introduced to it through their environment.

[–] afraid_of_zombies@lemmy.world 2 points 8 months ago (1 children)

Not exactly a fair analogy. First off, it is willful exposure to CP, not incidental. Secondly, the concern isn't that someone is oriented towards children; the concern is the action. We can't, and should never, attempt to police a person's mind. We can, however, as a society demand that adults don't rape kids. Homosexuality is not the same: the vast majority of western society is fine with the action. So even if you could demonstrate a link between watching more gay porn and being more willing to have gay sex, it wouldn't matter.

[–] _cnt0@sh.itjust.works 5 points 8 months ago

Nice rephrasing of what I said (mostly). Homosexuality - and heterosexuality, and any sexuality for that matter - are only acceptable as long as there is consent. The only difference, as I've pointed out, is that with pedophilia there is no scenario in which there can be consent. That doesn't matter, though, as long as it stays in somebody's mind or the virtual realm.

If you strictly distinguish between desire and action, it is an absolutely fair comparison. I do, and I do so explicitly. Some people don't, ignore that I do, and then get wound up about what they think I said.

load more comments (9 replies)
[–] where_am_i@sh.itjust.works 11 points 8 months ago (1 children)

Also: "they actually track that I was searching for something illegal, let me rather not do it again".

[–] _cnt0@sh.itjust.works 6 points 8 months ago (2 children)

Like anything on the internet wasn't tracked. If need be people will resort to physically exchanging storage media.

[–] Blueberrydreamer@lemmynsfw.com 12 points 8 months ago (1 children)

But having that tracking shown to you has a very powerful psychological effect.

It's pretty well established that increasing penalties for crimes does next to nothing to prevent those crimes. But what does reduce crime rates is showing how people were caught for crimes, making people believe that they are less likely to 'get away with it'.

Being confronted with your own searches is an immediate reminder that the searcher is doing something illegal, and that they are not doing so unnoticed. That's wildly different than abstractly knowing that you're probably being tracked somewhere by somebody among billions of other people.

[–] _cnt0@sh.itjust.works 3 points 8 months ago (1 children)

And where is the quantification and qualification for that? Spoiler: it's not in the article(s), and it's not one Google search away. Does Nintendo succeed in stopping piracy with its show trials? If you have a look around here, it looks more like people are doubling down.

[–] Blueberrydreamer@lemmynsfw.com 2 points 8 months ago (2 children)

I mean, I know Google has been shitty lately, but Wikipedia isn't hard to find: https://en.m.wikipedia.org/wiki/Deterrence_(penology)

I'd wager Nintendo has put some fear into a few folks considering developing emulators, but that's the only comparison to be made here. The lack of any real consequences for individuals downloading roms is why so many are happy to publicly proclaim their piracy.

Now, I bet if megaupload added an AI that checked users uploads for copyrighted titles and gave everyone trying to upload them a warning about possible jail time, we'd see a hell of a lot less roms and movies on mega.
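
That hypothetical check is the same pattern as the CSAM search warning, applied at upload time: match the file hash or title against a catalog of known works and warn before accepting. A minimal sketch, with a made-up catalog and warning text (megaupload's actual systems are not public, and nothing here is theirs):

```python
import hashlib

# Stand-in catalog of known copyrighted works: exact file hashes plus
# title keywords. A real catalog would come from rights holders.
KNOWN_HASHES = {"0" * 64}  # placeholder sha256 digests
TITLE_KEYWORDS = {"super mario", "zelda"}

UPLOAD_WARNING = (
    "This upload appears to match copyrighted material. "
    "Distributing it may expose you to civil or criminal liability."
)

def check_upload(file_bytes: bytes, title: str) -> dict:
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest in KNOWN_HASHES or any(k in title.lower() for k in TITLE_KEYWORDS):
        return {"accepted": False, "warning": UPLOAD_WARNING}
    return {"accepted": True, "warning": None}
```

Exact-hash matching is trivially evaded by flipping one byte, and title matching by renaming, so a real system would need fuzzy matching; even then, as the reply below argues, uploads would likely just move elsewhere.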

[–] _cnt0@sh.itjust.works 1 points 8 months ago (1 children)

Now, I bet if megaupload added an AI that checked users uploads for copyrighted titles and gave everyone trying to upload them a warning about possible jail time, we'd see a hell of a lot less roms and movies on mega.

It would simply obsolete megaupload. Sharing platforms come and go. If one distribution channel stops working, people will use (or create) another.

load more comments (1 reply)
[–] _cnt0@sh.itjust.works 1 points 8 months ago

Btw, you might want to read that wiki page in full yourselves.

[–] Zorsith@lemmy.blahaj.zone 5 points 8 months ago

One will never exceed the bandwidth of a semi loaded with hard drives
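
That's the classic "never underestimate the bandwidth of a station wagon full of tapes" point, and the arithmetic backs it up. A back-of-the-envelope calculation, where the drive count, capacity, and trip time are all figures assumed for illustration:

```python
# Back-of-the-envelope sneakernet bandwidth; every figure here is assumed.
drives = 10_000          # drives packed into one semi trailer
tb_per_drive = 20        # terabytes per modern HDD
trip_hours = 24          # one day of driving

payload_bits = drives * tb_per_drive * 1e12 * 8      # total payload in bits
bandwidth_bps = payload_bits / (trip_hours * 3600)   # bits per second

print(f"{bandwidth_bps / 1e12:.1f} Tbit/s")  # ~18.5 Tbit/s
```

Latency is a day instead of milliseconds, but the throughput dwarfs any consumer link, which is the point about physical exchange being out of reach of online tracking.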

[–] Silentiea@lemm.ee 9 points 8 months ago (1 children)

Why target the consumer and not the source?

If for no other reason than it doesn't have to be either/or. If you can meaningfully reduce demand for a "product" as noxious as CSAM, you should expect the rate of production to slow. There are certainly efforts in place to prevent that production from ever being done, and to prevent it from being shared/hosted once it is, but I don't think attempting to reduce demand in this way is going to hurt.

[–] _cnt0@sh.itjust.works 6 points 8 months ago (4 children)

Does it reduce the demand though? Where are the measurements attesting to that? If history has shown one thing, it is that criminalizing things creates criminals. Did Prohibition stop people from making, trading, or consuming alcohol? How does this have any meaningful impact on the abuse of children? The article(s) completely fail to elaborate on that end. I'm missing the statistics/science here. What are the measuring instruments for assessing any form of success? Just that searches were blocked and people were shown some links? ... TL;DR: is this something with an actual positive impact, or just an exercise in virtue signaling and a waste of time and money? Blind "fixes" are rarely useful.

load more comments (4 replies)
[–] afraid_of_zombies@lemmy.world 5 points 8 months ago (1 children)

Maybe liability, or pretending to help? That way they can claim later on, "we care about people struggling with this issue, which is why, when they search for terms related to it, we offer the help they need." Kinda like how if you search for certain terms on Google it pops up a suicide hotline on top.

Ok Google, just because I looked up some stuff about being sad in winter doesn't mean I'm planning to put a gun in my mouth.

[–] _cnt0@sh.itjust.works 2 points 8 months ago

Yah, this feels more like a legal protection measure and virtue signaling. There's absolutely no assessment of the efficiency or even efficacy of the measures - at least not in the article or the ones it links to, and I couldn't find anything substantial on it.