This post was submitted on 29 Aug 2025
442 points (99.6% liked)
Not The Onion
Welcome
We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!
The Rules
Posts must be:
- Links to news stories from...
- ...credible sources, with...
- ...their original headlines, that...
- ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”
Please also avoid duplicates.
Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.
And that’s basically it!
I don't actually have a problem with this. If people are stupid enough to admit to a crime, or to engage in criminal activity, on a platform they don't control, that's on them. I see this as the next step in the evolution from people who would commit a crime on YouTube for views and then get shocked-Pikachu'd when the police arrested them for it. They have no one to blame but themselves: they brought a third-party AI company into it, and that company never consented to being an accomplice. And if any company out there has the resources to have AI scan conversations and flag them for the police with good accuracy, OpenAI would definitely be at the front of the pack.
I kinda agree. While I do want these LLM companies to be more private in terms of data retention, I think it's naive to say that a company selling artificial intelligence to hundreds of millions of users should be totally indifferent in the face of LLM-induced psychosis and suicide. Especially when the technology only gets more hazardous as it becomes more capable.