The FBI has issued an advisory warning of an expected increase in the use of deepfakes for social engineering attacks. Deepfakes are images, videos, audio, or text created via AI to produce extremely convincing imitations of real people.
“Malicious actors almost certainly will leverage synthetic content for cyber and foreign influence operations in the next 12-18 months,” the advisory states. “Foreign actors are currently using synthetic content in their influence campaigns, and the FBI anticipates it will be increasingly used by foreign and criminal cyber actors for spear phishing and social engineering in an evolution of cyber operational tradecraft.”
The FBI says threat actors will also incorporate deepfakes into sophisticated business email compromise scams.
“Synthetic content may also be used in a newly defined cyber attack vector referred to as Business Identity Compromise (BIC),” the FBI says. “BIC will represent an evolution in Business Email Compromise (BEC) tradecraft by leveraging advanced techniques and new tools. Whereas BEC primarily includes the compromise of corporate email accounts to conduct fraudulent financial activities, BIC will involve the use of content generation and manipulation tools to develop synthetic corporate personas or to create a sophisticated emulation of an existing employee. This emerging attack vector will likely have very significant financial and reputational impacts to victim businesses and organizations.”
Fortunately, many deepfakes aren’t perfect (at least not yet), and the Bureau offers the following advice for spotting this kind of content.
“Visual indicators such as distortions, warping, or inconsistencies in images and video may be an indicator of synthetic images, particularly in social media profile avatars,” the advisory says. “For example, distinct, consistent eye spacing and placement across a wide sample of synthetic images provides one indicator of synthetic content. Similar visual inconsistencies are typically present in synthetic video, often demonstrated by noticeable head and torso movements as well as syncing issues between face and lip movement, and any associated audio. Third-party research and forensic organizations, as well as some reputable cyber security companies, can aid in the identification and evaluation of suspected synthetic content.”
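The eye-spacing indicator the advisory mentions lends itself to a rough automated first pass. The sketch below is a minimal illustration only, assuming OpenCV’s stock Haar eye detector, a hypothetical avatars/ folder of suspect profile images, and an arbitrary threshold of our own choosing; none of this tooling comes from the FBI advisory. It measures how uniformly eyes are placed across a batch of avatars, since unusually low variation is one possible signal of GAN-generated faces.

```python
# Minimal sketch of the "consistent eye spacing" heuristic: across a batch of
# suspected profile avatars, GAN-generated faces tend to place the eyes at
# nearly identical positions, so unusually low variance in normalized eye
# spacing is a red flag. Paths and the threshold below are illustrative
# assumptions, not guidance from the FBI advisory.
import glob
import statistics

import cv2  # pip install opencv-python

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml"
)


def normalized_eye_spacing(image_path):
    """Return the distance between the two largest detected eyes,
    normalized by image width, or None if two eyes are not found."""
    img = cv2.imread(image_path)
    if img is None:
        return None
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) < 2:
        return None
    # Keep the two largest detections and compare their centers.
    eyes = sorted(eyes, key=lambda e: e[2] * e[3], reverse=True)[:2]
    (x1, y1, w1, h1), (x2, y2, w2, h2) = eyes
    c1 = (x1 + w1 / 2, y1 + h1 / 2)
    c2 = (x2 + w2 / 2, y2 + h2 / 2)
    dist = ((c1[0] - c2[0]) ** 2 + (c1[1] - c2[1]) ** 2) ** 0.5
    return dist / img.shape[1]


spacings = [s for s in map(normalized_eye_spacing, glob.glob("avatars/*.jpg")) if s]

if len(spacings) >= 5:
    spread = statistics.stdev(spacings)
    print(f"Eye-spacing std dev across {len(spacings)} avatars: {spread:.4f}")
    # Real photos vary; near-zero spread across many unrelated avatars is one
    # signal (not proof) that the images may be synthetic.
    if spread < 0.01:  # illustrative threshold
        print("Suspiciously uniform eye placement: review these accounts manually.")
```

A check like this only flags candidates for human review; the advisory’s broader point is that forensic specialists and reputable security vendors should be consulted for serious evaluation.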
The FBI also points to the SIFT framework as a way to evaluate suspicious content and blunt these attacks.
“Finally, familiarity with media resiliency frameworks like the SIFT methodology can help mitigate the impact of cyber and influence operations,” the FBI says. “The SIFT methodology encourages individuals to Stop, Investigate the source, Find trusted coverage, and Trace the original content when consuming information online.”
Concern about deepfakes has mostly focused on their potential use in disinformation campaigns and influence operations, but as the FBI points out, they can’t be overlooked as a social engineering tool. New-school security awareness training can give your employees a healthy sense of suspicion so they can avoid falling for evolving social engineering tactics.
The FBI has the story.