this post was submitted on 11 Jul 2024
Technology
Are you surprised by teenage boys making fake nudes of girls in their school? I'm surprised by how few of these cases have made the news.
I don't think there's any way to put this cat back in the bag. We should probably work on teaching boys not to be horrible.
I'm not sure you can teach boys not to be horny teenagers.
Being horny is one thing; sharing this stuff is another. If whoever made the fake had kept it to themselves, nobody would've even known. The headline is still ass and typical "AI" hysteria, though.
Having been a teenage boy myself, I wouldn't dream of trying.
But I knew it wasn't OK to climb a tree with binoculars to try to catch a glimpse of the girl next door changing clothes, and I knew it wasn't OK to touch people without their consent. I knew people who did things like that were peeping toms and rapists. I believed peeping toms and rapists would be socially ostracized and legally punished more harshly than they often are in reality.
Making and sharing deepfakes of real people without their consent belongs on the same spectrum.
We do eventually grow up at least
... into horny men
... but hopefully with a little more empathy and propriety.
There are always two paths to take - take away all of humanity's tools or aggressively police people who abuse them. No matter the tool (AI, computers, guns, cars, hydraulic presses) there will be somebody who abuses it, and for society to function properly we have to do something about the delinquent minority of society.
Hydraulic press channel guy offended you somehow? I'm missing something here.
No, just an example. But if you've ever noticed the giant list of safety warnings on industrial machinery, you should know that every single one of those rules was written in blood.
Sometimes other bodily fluids.
The machines need to be oiled somehow.
🤨 vine boom
Either Darwin awards or assholes, most likely. Those warnings are written due to fear of lawsuit.
However, this tool doesn't have any safety warnings written on it. The app they used specifically caters to use cases like this. They advertise it for immoral use, and we've had technology that can estimate age from pictures for like 10 years now. And they deliberately chose to let their tool generate pictures of 13-year-old girls. In the tool analogy, that's like knowingly selling a jigsaw that misses well-established safety standards and is likely to injure someone. And it's debatable whether it was made to cut wood at all, or just to injure people.
And the rest fits, too. No company address, located in some country where they can't be prosecuted... They're well aware of the use case of their app.
I don't think they're offended. I think they're saying that a tool is a tool. A gun or AI are only dangerous if misused, like a hydraulic press.
We can't go around removing the tools because some people will abuse them. Any tool can kill someone.
We could also do a better job of teaching people from childhood not to be assholes.
Guns do not belong in the list. Guns are weapons, not tools. Don't bother posting some random edge case that accounts for approximately 0.000001% of use. This is a basic category error.
Governments should make rules banning and/or regulating weapons.
Weapons are tools, by strict definition, and there are legitimate uses for them. Besides, my point was that they should be regulated. In fact, because they are less generally useful than constructive tools, they should be regulated far MORE strictly.
It's like those x-ray apps that obviously didn't work but were promoted as letting you see all the women naked. Somehow that was very cool and no one cared. Suddenly there's something that kinda works, and everyone is shocked.