As audio deepfake technology goes mainstream as part of the broader evolution of AI-based tools, new data shows there are already plenty of victims, and most people aren't prepared for such an attack.
Imagine you get a call from your child or grandchild telling you they're in some kind of trouble, perhaps an accident, and they need money. And it really does sound like them. Would you help?
Scammers using AI to synthesize sound-alike voices in vishing calls are hoping you will. And according to McAfee's new report, Beware the Artificial Impostor, recipients of such calls are falling victim.
Globally, 25% of respondents said they had personally experienced an AI sound-alike voice scam call or knew someone who had. With online services like ElevenLabs offering Instant Voice Cloning to generate a synthesized voice from 30 minutes of audio samples, it's only a matter of time before threat actors leverage AI voice-based scams even more.
According to McAfee, nearly half (48%) of people would help if they received a call about a car accident, 47% would help with a call about being the victim of theft, 43% for a lost wallet, and 41% for needing help while on vacation.
The worst part: 35% of people said they couldn't tell whether the voice was real, and another 35% had no idea whether they'd be able to tell. That means the only real signal that a call is a scam is often the fact that the call itself is unexpected. In the business world, this type of scam applies to everything from CEO gift card scams to digital fraud and more. All of it requires that users within the organization be continually enrolled in security awareness training so they stay vigilant even when the voice on the other end of the phone sounds familiar.