The Federal Trade Commission is alerting consumers to a more sophisticated version of the family emergency scam, one that uses AI voice cloning to imitate a "family member in distress".
The alert opens with a scenario: "You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble — he wrecked the car and landed in jail. But you can help by sending money. You take a deep breath and think. You've heard about grandparent scams. But darn, it sounds just like him. How could it be a scam? Voice cloning, that's how."
"Don’t Trust The Voice"
The FTC explains: "Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie. We're living with it, here and now. A scammer could use AI to clone the voice of your loved one. All he needs is a short audio clip of your family member's voice — which he could get from content posted online — and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one.
So how can you tell if a family member is in trouble or if it’s a scammer using a cloned voice?
Don’t trust the voice. Call the person who supposedly contacted you and verify the story. Use a phone number you know is theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friends."
The full text of the alert is available on the FTC website. Share it with friends, family, and co-workers.