Navigating the Threat Landscape of AI-Driven Corporate Frauds

It is not the first time that a so-called "Deep Fake Boss" has defrauded a company of millions. In the case cited by the Hong Kong Free Press, the damage amounts to almost 24 million euros (200 million Hong Kong dollars). Such cases are still isolated for now, unlike the related email-based CEO scam, also known as BEC (Business Email Compromise) fraud.

February 22, 2024

In principle, the approach is always the same. An employee, preferably in the finance department, is instructed by the company's leadership, the "Boss", to transfer money to an account, confidentially of course. The pretext could be finalizing a deal, a corporate takeover, or, on a smaller scale, buying Christmas gifts for all colleagues. What these stories have in common is that everything must happen quickly and nobody else may be told. But who would refuse their boss when asked in a video conference? Exactly…



The Boss scam falls into the category of "confidence scams": fraud that relies on gaining the victim's trust. The story and the overall impression must fit together. Often, an email sent from the leadership's account (or one appearing to be) is convincing enough. In this case, a Deep Fake video was allegedly used. Seeing the other person directly and hearing their voice on video can be very convincing. As an employee, you might silently accept the instructions, especially when told "just listen, I'm in a hurry". At least that seems to have been the case here, because according to the article no real interaction took place. This is crucial to the success of the method.


State of the Art – Deep Fake

Deep Fakes are video and audio recordings in which a person is digitally inserted or altered using artificial intelligence. In a video or photo, one person's facial features can be overlaid onto another's. Recently, fake photos of Taylor Swift circulated online, and a digitally rejuvenated Harrison Ford played the young Indiana Jones in scenes set in the Nazi era in a recent Disney film. The results are so convincing that the manipulated person can practically be made to say anything; video and audio can be perfectly synchronized. All it takes is enough material of the target person, such as interviews, speeches, photos and/or audio recordings.

Creating a live appearance for an artificially generated person in a meeting, however, is a different matter. Making an artificially generated image with an artificially generated voice respond in real time is theoretically possible, but the result is not particularly convincing: there are long pauses, facial expressions and tone do not match the words, and it looks fake. Something feels off.


Protection against the Boss scam

The best protection against the Boss scam (and BEC in general) is to structure internal payment processes so that money cannot be authorized by a simple request or instruction from a single individual; instead, payments should go through a more elaborate approval process. In the past, the principle of "double" confirmation was often emphasized here: if the instruction came via email, one called back to verify; if it came over the phone, an email confirmation was requested. This still holds true today. However, there is a lesson to learn from Deep Fake attacks. In this case, video and audio confirmation did occur, but there was no real interaction; the employee only listened and watched. As a result, a previously prepared Deep Fake video could simply be played back. The employee was deceived, and the money was transferred. To prevent such attacks from succeeding, employees should be allowed, and even encouraged, to question their bosses about such requests. This makes life more difficult for attackers. However, BEC-type attacks like the "Deep Fake Boss" cannot be completely ruled out as long as it remains fundamentally possible to pay out money on a single person's instructions.
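The multi-party approval process described above can be sketched in a few lines of code. This is a minimal illustration, not a real payment system; all names here (`PaymentRequest`, `REQUIRED_APPROVERS`, the example approvers) are hypothetical. The point it demonstrates is that no single instruction, however convincing the video call, can release funds on its own:

```python
from dataclasses import dataclass, field

REQUIRED_APPROVERS = 2  # policy: no single person can release funds


@dataclass
class PaymentRequest:
    """A payment instruction that must collect independent approvals."""
    amount_eur: int
    beneficiary: str        # hypothetical placeholder, not a real account
    requested_by: str
    approvals: set = field(default_factory=set)


def approve(request: PaymentRequest, approver: str) -> None:
    """Record an approval; the requester may not approve their own payment."""
    if approver == request.requested_by:
        raise PermissionError("requester cannot approve their own payment")
    request.approvals.add(approver)


def is_authorized(request: PaymentRequest) -> bool:
    """Funds are released only after enough distinct approvers sign off."""
    return len(request.approvals) >= REQUIRED_APPROVERS


req = PaymentRequest(amount_eur=24_000_000, beneficiary="example-account",
                     requested_by="cfo")
approve(req, "controller")
print(is_authorized(req))   # False: one approval is not enough
approve(req, "treasury")
print(is_authorized(req))   # True: two independent approvals
```

Even if an attacker perfectly impersonates the "cfo" in a video call, the deception must succeed against two further people who are each entitled, and expected, to ask questions.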
