this post was submitted on 06 Feb 2024
230 points (97.9% liked)


Deepfake scammer walks off with $25 million in first-of-its-kind AI heist: Hong Kong firm tricked by simulation of multiple real people in video chat, including voices.

top 23 comments
[–] theskyisfalling@lemmy.dbzer0.com 38 points 9 months ago (5 children)

What kind of company lets a single employee transfer that amount of money without multiple password entries or sign-off from different people, seriously?

Doesn't matter if they had a conference call with what appeared to be certain people, as the article says; they could easily have used key-pair verification such as PGP. Sounds like poor security all around, especially considering the amounts involved.
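The "checks from different people" idea is just dual-control approval. A minimal sketch of what that policy looks like in code, with hypothetical names and thresholds (nothing here is from any real banking system):

```python
# Hypothetical sketch of dual-control ("four-eyes") approval for large transfers.
# The threshold, approver count, and names are illustrative only.

APPROVAL_THRESHOLD = 100_000   # transfers above this need multiple approvers
REQUIRED_APPROVERS = 2

def transfer_allowed(amount: float, approvers: set[str], requester: str) -> bool:
    """A transfer over the threshold needs REQUIRED_APPROVERS distinct
    approvers, none of whom may be the person requesting the transfer."""
    if amount <= APPROVAL_THRESHOLD:
        return True
    independent = approvers - {requester}   # requester can't approve themselves
    return len(independent) >= REQUIRED_APPROVERS

# The scenario in the article: one employee authorizing $25M alone.
print(transfer_allowed(25_000_000, {"employee"}, "employee"))            # False
print(transfer_allowed(25_000_000, {"cfo", "controller"}, "employee"))   # True
```

Under a rule like this, a single deepfaked video call couldn't have moved the money; at least one more real human would have had to sign off.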

[–] WhatAmLemmy@lemmy.world 15 points 9 months ago* (last edited 9 months ago) (3 children)

PGP? Have you ever dealt with any banking or financial corporations? You'd have better luck getting the money handlers and decision makers to authenticate transactions with magic.

Hong Kong and Japan are the absolute worst I've experienced. Their online banking UI's and processes are stuck in the late 90's to early 2000's.

Japan:

your second authentication factor will be stored on this 3.5" floppy disk

[–] EssentialCoffee@midwest.social 2 points 9 months ago

Has South Korea moved on from Internet Explorer for their banking yet?

[–] itsnotits@lemmy.world -3 points 9 months ago (1 children)
  • online banking UIs*
  • the late '90s*
  • early 2000s*
[–] Silentiea@lemm.ee 1 points 9 months ago

It's stylistically acceptable to put an apostrophe for plurals in cases where the plural thing isn't a "normal" word, as is the case for initialisms like UI or numbers like the latter two you caught.

Obviously a given body may make its own rules in this regard, but luckily English has no overall authority, and this is informal communication outside the domain of any minor ones (beyond, perhaps, idle pedants and prescriptivists).

[–] meat_popsicle@sh.itjust.works 6 points 9 months ago

lol Finance is sometimes hilariously low tech. Look up how ACH works, it's a fucking farce.

[–] itsnotits@lemmy.world 5 points 9 months ago (2 children)
[–] theskyisfalling@lemmy.dbzer0.com 5 points 9 months ago

Good catch, autocorrect is a bastard :p

[–] Silentiea@lemm.ee -1 points 9 months ago

It's yes tits, I think.

[–] Lmaydev@programming.dev 2 points 9 months ago* (last edited 9 months ago)

Somewhere I worked, the CEO's email got hacked, and the attackers asked the head of finance to change the bank account details for a 100k payment that was due to go out.

Luckily the head of finance thought to double-check with the CEO directly. But it came really close to happening.

This all happened via a phishing email.

Social engineering is how most hacks happen. Doesn't matter what protection you put in place. People are always the weakest link.

[–] Cornelius_Wangenheim@lemmy.world 1 points 9 months ago

Or just have everyone's phone number on file and pick up the phone and call them first.

[–] redcalcium@lemmy.institute 26 points 9 months ago* (last edited 9 months ago)

Acting senior superintendent Baron Chan Shun-ching of the Hong Kong police emphasized the novelty of this scam, noting that it was the first instance in Hong Kong where victims were deceived in a multi-person video conference setting. He pointed out the scammer's strategy of not engaging directly with the victim beyond requesting a self-introduction, which made the scam more convincing.

The police have offered tips for verifying the authenticity of individuals in video calls, such as asking them to move their heads or answer questions that confirm their identity, especially when money transfer requests are involved. Another potential solution to deepfake scams in corporate environments is to equip every employee with an encrypted key pair, establishing trust by signing public keys at in-person meetings. Later, in remote communications, those signed keys could be used to authenticate parties within the meeting.
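The key-pair scheme described above boils down to challenge-response authentication: send the participant something random, and only someone holding the right key can answer correctly. A real deployment would use asymmetric signatures (PGP or Ed25519 style, so the verifier never holds the secret); this sketch uses a pre-shared HMAC key from the standard library purely to illustrate the flow, and every name in it is hypothetical:

```python
# Sketch of challenge-response verification for a video-call participant.
# Real systems would use public-key signatures as the article suggests;
# a shared HMAC key is used here only to keep the example stdlib-only.
import hashlib
import hmac
import secrets

def sign_challenge(key: bytes, challenge: bytes) -> str:
    """The participant proves identity by keying an HMAC over the challenge."""
    return hmac.new(key, challenge, hashlib.sha256).hexdigest()

def verify_participant(key: bytes, challenge: bytes, response: str) -> bool:
    expected = sign_challenge(key, challenge)
    return hmac.compare_digest(expected, response)  # constant-time comparison

# Keys would be established at an in-person meeting, per the article.
cfo_key = secrets.token_bytes(32)

# During the call: issue a fresh random challenge before approving anything.
challenge = secrets.token_bytes(16)

genuine = sign_challenge(cfo_key, challenge)     # real CFO holds the key
print(verify_participant(cfo_key, challenge, genuine))       # True

fake = "00" * 32                                 # a deepfake can't produce this
print(verify_participant(cfo_key, challenge, fake))          # False
```

The fresh random challenge matters: it stops a scammer from replaying a response captured from an earlier, legitimate call.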

If you're a rank-and-file employee in a virtual meeting with your company's top brass, it probably won't occur to you to ask them to turn their heads to see if the video glitches. The scammers can just act offended and ignore your request. Chances are you'll fear for your job and apologize profusely.

The key exchange mechanism suggested by the article sounds impractical because the employees in HK likely never meet the CFO from the UK in person. Maybe the corporate video conferencing system should have a company-wide key registry, but if the scammers managed to hack in and insert their own key, or steal a top brass's video conferencing account, then it's probably all moot anyway.

[–] Sunforged@lemmy.ml 18 points 9 months ago (1 children)

This is incredible. And scary. And incredible. I would hate to be the poor sap that fell for it though, oof.

[–] EdibleFriend@lemmy.world 8 points 9 months ago

Sounds like he just picked a big batch of oopsie daisies

[–] PeroBasta@lemmy.world 12 points 9 months ago (1 children)

I'd like to hear the whole story, like how old the scammed guy was, etc.

To me it smells like he was an accomplice, or a very old person who was fed up with his company's shit.

[–] redcalcium@lemmy.institute 6 points 9 months ago

These new realtime deepfake systems are very good; DeepFaceLive is one example. It can generate a deepfake in real time from just a single photo, and an even more convincing one if you have thousands of the target's photos/video frames to train the deepfake model. It's not surprising that someone could fall for it if they're not aware of the technology.

[–] someguy3@lemmy.world 11 points 9 months ago

Used to be so easy to spot scams and fakes, this stuff now is getting scary. I wonder if this will slow things down as we require face to face and in person confirmation.

[–] jet@hackertalks.com 8 points 9 months ago (1 children)

And why can't they claw back the funds through traditional banking? It's not like they sent crypto to an unknown address.

[–] meat_popsicle@sh.itjust.works 1 points 9 months ago* (last edited 9 months ago) (1 children)

The money left the account 1 second after it hit, that bank likely has nothing to return.

[–] jet@hackertalks.com 1 points 9 months ago

It would be nice if the reporter had asked the relevant authorities and gotten that statement.

[–] autotldr@lemmings.world 7 points 9 months ago

This is the best summary I could come up with:


Deepfakes utilize AI tools to create highly convincing fake videos or audio recordings, posing significant challenges for individuals and organizations to discern real from fabricated content.

This incident marks the first of its kind in Hong Kong involving a large sum and the use of deepfake technology to simulate a multi-person video conference where all participants (except the victim) were fabricated images of real individuals.

Despite initial doubts, the employee was convinced enough by the presence of the CFO and others in a group video call to make 15 transfers totaling HK$200 million to five different Hong Kong bank accounts.

The high-tech theft underscores the growing concern over new uses of AI technology, which has been spotlighted recently due to incidents like the spread of fake explicit images of pop superstar Taylor Swift.

Over the past year, scammers have been using audio deepfake technology to scam people out of money by impersonating loved ones in trouble.

The police have offered tips for verifying the authenticity of individuals in video calls, such as asking them to move their heads or answer questions that confirm their identity, especially when money transfer requests are involved.


The original article contains 519 words, the summary contains 192 words. Saved 63%. I'm a bot and I'm open source!

[–] Madeyro@lemmy.dbzer0.com 6 points 9 months ago

That's scary but at the same time super cool. I mean, kudos to the thieves, they did the hard work apparently.