this post was submitted on 16 Jun 2024
354 points (97.8% liked)
Technology
you are viewing a single comment's thread
I'm positive they got notified they were hosting a massive amount of CSAM, or similarly awful AI-generated material, since it's the Wild West out there now. This was their only way out.
Sounds like a smokescreen to me. All file-sharing services have this problem. The solution is to respond to subpoena requests and let the government do its job. They do not have to allow themselves to arbitrarily violate their users' privacy in order to do that.
No, they don't. If you're storing something that is found by a law enforcement agency, you are legally liable. That's the difference.
You can't just say out loud "Hey users, please stop storing CSAM on our servers." Not how that works.
Adobe is not a video distribution platform. They do not have this level of culpability.
Adobe's Creative Cloud requires storing images and video on their servers in order to edit them. That's what this is about.
That's not the same as content distribution.
Sharing content with clients cannot be done effectively through Creative Cloud.
It does not make sense to try to stop distribution at the level of video editing. Not only is the thought of child predators making regular use of professional editing software completely absurd, but even if you assume they do, why the fuck would they use the built-in cloud sharing tools to do so? They would just encrypt the contents and transmit them over any other file-sharing service...
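To illustrate the point about scanning being trivially defeated: anyone can encrypt a file before uploading it, after which a server-side scanner sees only opaque bytes. This is a toy sketch using a one-time-pad XOR (not production cryptography; in practice a criminal would use any off-the-shelf tool), purely to show that the ciphertext reveals nothing without the key:

```python
import secrets

def xor_pad(data: bytes, key: bytes) -> bytes:
    # One-time-pad XOR: without the key, the ciphertext is
    # indistinguishable from random bytes, so content scanning
    # on the server side learns nothing.
    return bytes(a ^ b for a, b in zip(data, key))

plaintext = b"any file contents whatsoever"
key = secrets.token_bytes(len(plaintext))  # key shared out of band

ciphertext = xor_pad(plaintext, key)       # this is what gets uploaded
recovered = xor_pad(ciphertext, key)       # recipient decrypts locally

assert ciphertext != plaintext  # the host only ever sees ciphertext
assert recovered == plaintext   # the recipient gets the original back
```

The upshot: scanning at the editing or storage layer only inspects users who aren't trying to hide anything.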
It makes no sense to implement this measure, because it does absolutely nothing to impede criminals, while granting a company well known for egregious privacy violations unprecedented access to information that completely law-abiding clients have legitimate reasons to keep private.
It is a farce: a smokescreen intended to encroach on customers' private data while doing nothing to assist law enforcement.
I realize it's gross and icky and morally problematic, but I really wonder whether having the government crack down on AI-generated CSAM is worth the massive risk to freedom of speech and privacy that it seems likely to open us up to. It's a massive blank check for everyone to become Big Brother.
There are no laws about it anywhere right now, but I'm sure this is about something more concrete. This has played out many times in the past (Amazon, Apple, Google, FB, etc.) across many different policing agencies: if they identify a huge host that is a problem, they notify it first and allow it to address the issue before making themselves known and cracking down on the offenders. This feels like that same thing again.
AI or not, if a court can prosecute a case, they'll do so.