It sounds like a scene from a science fiction movie: during a video conference, the CEO asks an employee to transfer millions of dollars into another account. Only after completing the transfer does the employee learn, to his horror, that he was the unwitting pawn in a multimillion-dollar criminal heist. The person in the video conference wasn't the CEO at all, but a technological illusion pieced together from manipulated audio and video.
This outlandish scenario is not only plausible today, given rapid technological advances; it is the new reality of the post-pandemic digital landscape. Across all industries, a large share of organizations remain fully remote or hybrid. In-person interactions are increasingly scarce, and teleconference platforms are the new conference rooms. This abrupt shift to a virtual workplace offers a ripe environment for highly sophisticated cyberattacks.
One of the most dangerous ways cybercriminals use social engineering to gain access to sensitive data, financial information, trade secrets, or other confidential material is through deepfake technology. Combating this virtual trickery is challenging, but it can be done with the proper guidance and knowledge.
What is Deepfake Technology?
Deepfakes are computer-generated images or videos altered to look like real people. Coupled with sophisticated cloned audio of a person's voice, a deepfake can create a near-perfect simulation. Deepfake technology is often used for entertainment, but it is increasingly being used for malicious purposes such as scams or blackmail.
Deepfake technology—whether audio or visual—poses a massive threat to an organization's data and finances by manipulating innocent, unaware employees into participating in criminal activity or fraud. Many employees know to report an email that sounds like it's phishing for information. But if they receive a phone call or video chat from a supervisor asking for sensitive information, will they still suspect deceit? Probably not, which is what makes deepfake technology such a lucrative weapon for cybercriminals.
Deepfake In the News
In the past few years, deepfakes have emerged in cyberattacks hitting organizations around the world.
Forbes reported that in early 2020, a bank manager in Hong Kong transferred $35 million to another account on the phoned instructions of someone he believed was a company director. Because he recognized the director's voice, he didn't question the legitimacy of the request. The phone call, however, turned out to be part of an elaborate heist using deepfake voice technology.
Even earlier, in March 2019, a U.K. energy firm suffered a similar fate. The firm's CEO received a phone call from someone he thought was his boss, asking him to transfer $243,000 to the account of a Hungarian supplier. The call turned out to be a deepfake voice scam.
How Can I Protect My Organization?
Legislation is gradually starting to address the threat of deepfakes. Texas and California have banned deepfakes used to influence elections. The U.S. National Defense Authorization Act includes provisions addressing the problem as well.
In the meantime, there are many ways organizations can protect themselves. Knowing what signs to look for in a deepfake is the first step.
Some clues a video may be a deepfake are:
- Unnatural facial expressions, eye movement, or body movement
- Shifts in lighting
- Strange blinking or no blinking
- Awkward body posture
- Lips not synced with speech
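For technically inclined teams, some of these cues can even be screened for automatically. The sketch below is a minimal, illustrative heuristic rather than a production deepfake detector: it flags abrupt frame-to-frame lighting shifts, one of the cues listed above. The function names and the threshold value are hypothetical choices made for this example, and real detection systems rely on far more sophisticated models.

```python
import numpy as np

def lighting_shift_scores(frames):
    """Return the absolute change in mean brightness between consecutive frames.

    frames: a sequence of grayscale frames as 2-D NumPy arrays.
    Genuine footage tends to change brightness gradually; abrupt spikes
    can hint at the unnatural lighting shifts sometimes seen in deepfakes.
    """
    means = [float(frame.mean()) for frame in frames]
    return [abs(b - a) for a, b in zip(means, means[1:])]

def flag_suspect_transitions(frames, threshold=25.0):
    """Return indices of frame transitions whose brightness jump exceeds threshold."""
    scores = lighting_shift_scores(frames)
    return [i for i, score in enumerate(scores) if score > threshold]

# Synthetic example: three dim frames followed by two much brighter ones.
frames = [np.full((4, 4), 100.0)] * 3 + [np.full((4, 4), 200.0)] * 2
print(flag_suspect_transitions(frames))  # the jump occurs at transition index 2
```

A heuristic like this would only ever be one weak signal among many; in practice, organizations lean on trained reviewers and dedicated detection tools rather than a single threshold.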
Another way to protect your organization is to keep your cybersecurity software and procedures up to date. You can also educate employees about deepfakes and how to spot them.
As deepfakes continue to advance, your organization may want to consider consulting with social engineering professionals to combat the increased threat. Risk advisory experts work closely with your team to create custom, cutting-edge strategies to improve your information security control environment.
For more information about ways to protect your organization from deepfakes or other social engineering threats, please contact us.