If you receive an unexpected phone call from a family member in trouble, be careful: The other person on the line might be a scammer using AI voice technologies to pull off an impersonation.
The Federal Trade Commission is raising alarm bells about fraudsters exploiting commercially available voice-cloning software for family emergency scams.
These scams have been around for years and involve the culprit impersonating a family member, typically a child or grandchild. The fraudster will then call the victim, claiming they’re in desperate need of money to resolve an emergency.
The FTC now says AI-powered voice-cloning software can make the impersonation scam seem even more authentic, duping victims into handing over their funds. “All he (the scammer) needs is a short audio clip of your family member’s voice—which he could get from content posted online—and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one,” the FTC says in the Monday warning.
The FTC didn’t immediately respond to a request for comment, making it unclear if the US regulator has noticed a surge in scams involving voice-cloning technologies. But the warning arrives a few weeks after The Washington Post chronicled how scammers are abusing voice-cloning software to prey on unsuspecting families.
In one case, a scammer used the technology to impersonate a Canadian couple’s grandson, claiming he was in jail. In another incident, fraudsters used voice-cloning tech to swindle $15,449 from a couple who were likewise fooled into believing their son had been thrown in jail.
Not helping the matter is how voice-cloning services