This AI-created image of a video call seems appropriate. Looks awful, doesn’t it?
Sci-fi is upon us. Those movies we grew up watching are becoming reality. Today we’ll be discussing a successful Hong Kong deepfake scam that earned the criminals $25.6 million.
For those who might have forgotten, deepfakes involve the use of artificial intelligence to manipulate visual and audio data to generate highly realistic and often deceptive content. So, essentially, using AI to create convincing videos or audio.
You may have seen videos of, say, Trump singing “I’m a Barbie Girl” and that’s the kind of stuff we are talking about.
I particularly love heist movies and so this story scares me as much as it fascinates me. Here’s what happened:
The $25.6 million AI heist
A Hong Kong-based multinational company was swindled of HK$200 million (US$25.6 million) in a scam involving deepfake technology.
Scammers used digitally recreated versions of the company’s CFO and other employees in a convincing video conference, instructing an employee to transfer funds.
My friend, these guys had a video meeting with an employee of the unnamed company where the only real person was that employee. There were multiple deepfakes of other company employees that fooled this particular employee into thinking they were real.
In the meeting, the fake meeting, the employee was instructed to transfer funds totaling $25.6 million to 15 different bank accounts.
How was this person fooled tho?
If you’ve seen some of these deepfakes, they are not that convincing. So, how could the scammers fool the employee? It could be that the technology has improved, after all, AI is rapidly improving. I think though that the video call is the best way to fool someone.
We’ve all been in Zoom calls where every single person looks grainy and pixelated, even when you’re on good internet. Deepfakes can hide behind that: choppy video and out-of-sync audio won’t raise questions on a video call.
So, I can see myself being fooled as well. Especially if it’s a boss I don’t interact with much personally. I wouldn’t be able to pick up on strange word choices and the like.
That was the scammers’ plan, I reckon. Hong Kong police say the scammers did not engage directly with the victim beyond requesting a self-introduction, which made the scam more convincing.
The first of many
I’m afraid this is just the first of many high-stakes deepfake scams we are going to see. The tech has improved, and many of our interactions now happen on the internet; those two things guarantee that we will see more of this.
We will have to rethink our security protocols.
In the Hong Kong company’s case, I’m surprised that this was all it took to authorise transfers of $25.6 million. The scammers first sent an email with the transfer instructions and then initiated a video call. The employee says they found the email suspicious, but the video meeting quelled their doubts.
Hong Kong police have advice on what video calls of the future should look like: ask people to move their heads or answer some questions to confirm their identity, particularly when financial transactions are at stake.
“Uh, sorry boss but I need you to move your head, to the left, to the right. Okay, now look up and then down and we can get on with the meeting.” “Actually, tell me the colour of your coffee mug and we are golden.”
Video calls are going to start out with exercises and a game of 20 questions. It’s gonna be fun.
There’s a more technical solution to the problem, but it’s no fun: it doesn’t involve you commanding your boss to move their head.
The approach involves outfitting each employee with an encrypted key pair. Trust can be established by having individuals sign public keys during face-to-face meetings. These signed keys can then be utilised for authenticating participants in subsequent remote communications. Boo, not fun.
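As a rough sketch of how that could work, here’s a toy challenge-response in Python. It uses textbook RSA with tiny, well-known demo numbers purely for illustration; a real deployment would use a vetted cryptography library with modern keys (e.g. Ed25519) and proper signature padding, never a hand-rolled scheme like this.

```python
import hashlib
import secrets

# Toy RSA parameters -- far too small for real security, used only to
# illustrate the idea of signing with a private key and verifying with
# the public key that was exchanged at a face-to-face meeting.
P, Q = 61, 53
N = P * Q    # public modulus (part of the public key)
E = 17       # public exponent (part of the public key)
D = 2753     # private exponent (kept secret; E*D == 1 mod lcm(P-1, Q-1))

def digest(message: bytes) -> int:
    """Hash a message down to an integer small enough for the toy modulus."""
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % N

def sign(message: bytes, private_exp: int = D) -> int:
    """The remote party (say, the CFO) signs with their private key."""
    return pow(digest(message), private_exp, N)

def verify(message: bytes, signature: int, public_exp: int = E) -> bool:
    """The employee verifies with the public key they trust from the
    in-person key-signing."""
    return pow(signature, public_exp, N) == digest(message)

# Challenge-response at the start of a video call: the employee sends a
# fresh random nonce, and only someone holding the real private key can
# produce a signature that verifies. A deepfake can mimic a face and
# voice, but not a secret key it never had.
nonce = secrets.token_bytes(16)
signature = sign(nonce)
print(verify(nonce, signature))  # True for the genuine key holder
```

The freshness of the nonce matters: it stops a scammer from replaying a signature captured from an earlier, legitimate call.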
What’s your take?