this post was submitted on 16 Apr 2024
179 points (95.0% liked)

Technology

top 24 comments
[–] warmaster@lemmy.world 27 points 7 months ago* (last edited 7 months ago) (1 children)

UK: Making porn of unwilling celebrities is illegal.

US: Making commercial movies with unwilling actors is perfectly fine!

[–] moon@lemmy.ml 6 points 7 months ago

Not saying actors going unpaid is okay, but one of these things is a sex crime and should absolutely be illegal

[–] yggstyle@lemmy.world 25 points 7 months ago (3 children)

Step one... create consent deepfake....

I don't like that I thought of it... but it pains me to say it will be used as a defense at some point.

[–] usualsuspect191@lemmy.ca 15 points 7 months ago (1 children)

Oh wow... You could use a consent deepfake to trick another person into creating the explicit one. This gets messy quick...

[–] yggstyle@lemmy.world 4 points 7 months ago

Plausible deniability is a helluva thing.

[–] abhibeckert@lemmy.world 8 points 7 months ago* (last edited 7 months ago)

You're not the first to think of it, and it's where this whole idea will fall flat on its face.

There's just no way to actually check whether the subject of a photo consented to having their photo taken. That was difficult enough with physical cameras; it's so much more difficult now that no camera is involved in generating the image.

I mean, if I were to post an image here in this comment - how can the Fediverse possibly verify that I have the right to post it?

[–] Entropywins@lemmy.world 4 points 7 months ago

I just imagine someone showing up to my work and presenting that contract and next thing you know I'm stuck in the dryer with only my stepson Esteban to help me...

[–] UnpluggedFridge@lemmy.world 13 points 7 months ago (1 children)

This is a difficult issue to deal with, but I think the problem lies with our current acceptance of photographs as an objective truth. If a talented writer places someone in an erotic text, we immediately know that this is a product of imagination. If a talented artist sketches up a nude of someone, we can immediately recognize that this is a product of imagination. We have laws around commercial use of likenesses, but I don't think we would make those things illegal.

But now we have photographs that are products of imagination. I don't have a solution for this specific issue, but we all need to calibrate how we establish trust with persons and information now that photographs, video, speech, etc can be faked by AI. I can even imagine a scenario in the not-too-distant future where face-to-face conversation cannot be immediately trusted due to advances in robotics or other technologies.

Lying and deception are human nature, and we will always employ any new technologies for these purposes along with any good they may bring. We will always have to carefully adjust the line on what is criminal vs artistic vs non-criminal depravity.

[–] CleoTheWizard@lemmy.world 3 points 7 months ago

I just don't see why you'd make the creation of this stuff illegal. Right now you could use basic Photoshop to put people's faces onto dirty pictures. It hurts zero people and takes a similarly low amount of effort. As long as you keep it to yourself, society should not care.

Making it illegal also seems kind of dumb when you can just hold someone civilly liable for this stuff if they're posting nude photos of you, real or not. I don't see the issue with any of it if we treat the spread of these photos as if they were real and let people collect damages.

[–] shotgun_crab@lemmy.world 12 points 7 months ago* (last edited 7 months ago)

It should be a crime everywhere, but it's probably too late to regulate it anyway

[–] autotldr@lemmings.world 7 points 7 months ago

This is the best summary I could come up with:


The creation of sexually explicit deepfake content is likely to become a criminal offense in England and Wales as concern grows over the use of artificial intelligence to exploit and harass women.

Under a draft law, anyone who creates such an image or video of another adult without their consent — even if they don’t intend to share it — would face a criminal record and an unlimited fine, the UK justice department announced Tuesday.

Laura Farris, the United Kingdom’s Minister for Victims and Safeguarding, told ITV Tuesday that “to the best of (her) knowledge,” the two countries within the UK would be the first anywhere in the world to outlaw the creation of sexually explicit deepfakes.

The new offense applies only to adults as, under existing English and Welsh rules, creating deepfake sexual images of minors is already a crime.

That month, a bipartisan group of lawmakers in the United States introduced a draft civil law that, if passed, will allow the victims of sexually explicit deepfakes to sue the people who create and share such content without their consent.

“Deepfake pornography is a growing cause of gender-based harassment online and is increasingly used to target, silence and intimidate women — both on and offline,” Meta Oversight Board Co-Chair Helle Thorning-Schmidt said in a statement.


The original article contains 530 words, the summary contains 215 words. Saved 59%. I'm a bot and I'm open source!

[–] uriel238@lemmy.blahaj.zone 2 points 7 months ago (1 children)

Porn may soon be a crime in GB. The Tories are out for blood.

[–] TheGrandNagus@lemmy.world 2 points 7 months ago* (last edited 7 months ago)

People have been saying this since 2010 and it's still not illegal.

Tories are bastards who I can't wait to see electorally destroyed, but I'm tired of seeing this line trotted out.

There are zero signs that porn consumption will be a criminal offence.

[–] someguywithacomputer@lemmynsfw.com -2 points 7 months ago* (last edited 7 months ago) (1 children)

Big deal. /s Everything is a crime in England. Writing this comment is probably even a crime in England. I sure as hell don't have a license to use the computer monitor I'm viewing this on.

[–] TheGrandNagus@lemmy.world 1 points 7 months ago (2 children)

Where does this meme even come from lol

[–] warmaster@lemmy.world 2 points 7 months ago (2 children)

You are in direct violation of Penal Code 1.13, Section 9.

Please delete your account. You have twenty seconds to comply.

This has been a public service brought to you by OCP.

[–] TheGrandNagus@lemmy.world 0 points 7 months ago

Yes, very amusing.

I would like an explanation though.

[–] someguywithacomputer@lemmynsfw.com -2 points 7 months ago* (last edited 7 months ago) (1 children)

I posted this from a VPN that probably uses encryption at some point along the network path. Using encryption is a crime in England (unless they backpedaled on that already). Also, tea is stupid. Please air strike within 24 hours or at your earliest convenience.

[–] TheGrandNagus@lemmy.world 3 points 7 months ago* (last edited 7 months ago) (1 children)

Using encryption is not a crime in England lmao. Where do you get your news from, Alex Jones?

What, do you think banks store everybody's details in plain text, nobody's WiFi networks have passwords, etc?

[–] someguywithacomputer@lemmynsfw.com -1 points 7 months ago (2 children)

A month or two ago there were news articles all over Lemmy about how they were banning encryption for "safety" reasons.

[–] TheGrandNagus@lemmy.world 4 points 7 months ago

Most likely reports of dumb shit conservative politicians have called for, wanting access to messaging. You can find the exact same proposals in the US, European Parliament, individual EU countries, and a load of other places.

Dumb, sure, but a proposal that always goes nowhere isn't a law, and it's certainly not England or even UK-specific.

[–] abhibeckert@lemmy.world 4 points 7 months ago

Yeah no, those news articles were full of shit.

They're referencing the UK "Online Safety Bill" passed last year, which was very scary when it was just an idea that hadn't been written yet.

But when it finally was written and we got to see the actual content of the bill — it basically requires certain companies to use "accredited technology" to detect and block certain categories of illegal content (especially CSAM and foreign election propaganda).

All the major platforms are already taking extensive steps to block illegal content and there's a good chance they will be happy to use whatever "technology" is eventually "accredited".

A lot rides on the specifics of the "technology" which hasn't been clearly defined - but it certainly is not a ban on encryption.

[–] Wanderer@lemm.ee 1 points 7 months ago (1 children)

You need a TV licence to watch TV. But that's a law for the whole UK, not just England.

[–] TheGrandNagus@lemmy.world 1 points 7 months ago

The only weird thing about the TV licence is the name. It's not really a licence at all; it's just paying for a service you use.

If you watch live TV, you pay towards the broadcasting of it, plus the running of the BBC, which is ad-free in the UK.

The argument is that if it came from direct government funding, the government would have a much greater degree of control over the main national news source, which would likely be a worse solution. So would filling the BBC up with ads and cutting expensive content like proper news and excellent documentaries.

Calling it a licence kind of implies you need to apply for a card or a document to watch TV, which isn't the case.

The UK is far from the only country with publicly funded television that you're legally required to pay for if you use it.