Ever received an email from someone claiming to be your long-lost relative, stranded without their wallet and passport during a vacation, and begging for $1,000 to make it back home? We all know it's a classic scam. But let's be honest, even the savviest among us can fall for something like that, and AI is making it even tougher to tell when someone's trying to take us for a ride. The Federal Trade Commission (FTC) is now giving us a heads-up about a fresh con game where scammers actually call unsuspecting victims, pretending to be the victims' own family members and using AI tools to make the impersonation more convincing.
Federal Trade Commission Chair Lina Khan (via Bloomberg) said at a recent event that we "need to be vigilant early" as AI tools develop, because we've already seen how these tools can "supercharge" fraud.
Khan was talking about crooks grabbing snippets of people's voices from social media and training AI to mimic them. They then feed lines to AI-powered text-to-speech software to make it sound like their targets' distressed relatives. It's basically the same technique that mischievous 4chan users employed to make AI-generated voice clips of celebrities saying all sorts of terrible things. The scam typically involves the fake relative asking for money to get home, or claiming they're in jail and need bail, anything that will pry cash or financial information out of a sympathetic family member.
Khan is concerned that further development of voice-mimicking technology could lead to a surge in scams and other harmful activities, affecting things like civil rights, fair competition, consumer protection, and equal opportunity.