
Deepfake Audio is the Next Social Engineering Tool

Remember the repeated warnings about AI-powered deepfake technology and the potential for attackers to weaponize it? They’re doing it now.

On Friday, the Wall Street Journal reported (paywall) that crooks fooled the CEO of an unnamed UK-based energy firm into sending €220,000 to a fraudulent account. According to the company’s insurance firm, Euler Hermes Group SA, the attackers impersonated the chief executive of the energy company’s German parent, persuading the CEO to send money to a supposed Hungarian supplier.

The attackers followed a common modus operandi: they pressured the CEO into making the payment within the hour. But they added another layer of deception, impersonating the parent-company executive's voice, complete with his slight German accent and characteristic melody, using what investigators believe was an AI-based attack.

The attackers tried for a second payment later in the day, but this time the source of the call, from Austria, made the CEO suspicious and he refused.

This audio attack isn’t the first. In July, Symantec warned that it had seen three cases of crooks using deepfake audio to dupe senior financial controllers into transferring cash.

This represents another development in the ongoing evolution of business email compromise (BEC) scams, in which attackers use emails to fool executives into making these money transfers. A voice call that sounds like an executive adds another layer of credibility to a request. If used more widely, it could bump up overall losses from BEC scams, which the FBI's 2018 Internet Crime Report says doubled from 2017 levels to reach $1.3bn in 2018.

The attack undermines a protective measure that people shouldn't have been relying on in the first place: judging that a call is legitimate based on what someone sounds like. What effective measures can people take to protect themselves against BECs?

Pressures to comply in a short timeframe are a clear warning signal, as are requests for secrecy. Both should make people suspicious. For email requests, the FBI suggests using digital certificates to prove that an email came from an executive. Companies can also establish out-of-band communications channels to verify email requests, it adds, suggesting telephone calls as a second communication channel. Given the attack seen last week, it would be a good idea for the recipient of the call to make that telephone call using a number known to belong to the executive making the request. A separation of duties, in which two executives must approve a request, provides another layer of protection.
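The combination of controls described above, calling back only on a number already on file and requiring sign-off from two executives, can be sketched as a simple policy check. Everything in this sketch (the names, the phone numbers, the `payment_approved` function) is illustrative, not drawn from any real payment system:

```python
# Illustrative sketch of the call-back and separation-of-duties checks
# discussed above. All names and numbers are placeholders.

KNOWN_EXECUTIVE_NUMBERS = {
    "ceo": "+44 20 7946 0000",  # placeholder numbers on file,
    "cfo": "+44 20 7946 0001",  # verified in advance, not taken from the request
}

def payment_approved(amount_eur, approvers, callback_number_used):
    """Approve a transfer only if the verification call went to a
    pre-registered number and two distinct executives signed off."""
    if callback_number_used not in KNOWN_EXECUTIVE_NUMBERS.values():
        return False  # call-back must use a number known in advance
    if len(set(approvers)) < 2:
        return False  # separation of duties: two distinct approvers required
    return amount_eur > 0

# A single approver is rejected; two distinct approvers pass:
print(payment_approved(220_000, ["ceo"], "+44 20 7946 0000"))         # False
print(payment_approved(220_000, ["ceo", "cfo"], "+44 20 7946 0000"))  # True
```

The key design point is that the call-back number comes from a list maintained by the company, never from the suspicious request itself, which is exactly the gap the attackers exploited.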

Does all that seem over the top? You'll wish you had taken measures like these if you watch your money filter quickly to accounts in Mexico and then beyond, to other locations. That's how the story ended for the UK energy company, which Euler Hermes luckily reimbursed. As cybercriminals catch on to deepfake audio's scamming potential, companies will need all the protection they can get.
