As Artificial Intelligence (AI) gains popularity, scammers and fraudsters are increasingly turning to it, devising new ways to dupe unsuspecting people. The latest scheme to grab eyeballs is the AI voice scam, which fraudsters are using to loot money from their victims. What makes these scams so shocking and convincing is that the cloned voice sounds like someone you know or a loved one, making it easy for people to fall prey.
Now, to raise awareness and help people stay safe from these AI voice scams, McAfee has released an in-depth report. Check the details here.
Scamsters obtain sample audio simply from the voices people share on social media and elsewhere online. It is shockingly easy.
While this may seem harmless, our digital footprint and what we share online can arm cybercriminals with the information they need to target our friends and family. With just a few seconds of audio taken from an Instagram Live video, a TikTok post, or even a voice note, fraudsters can create a believable clone that can be manipulated to suit their needs.
In fact, McAfee's survey found that 53% of all adults share their voice online at least once a week, with 49% doing so up to 10 times in the same period. The practice is most common in India, with 86% of people making their voices available online at least once a week, followed by the U.K. at 56%, and then the U.S. at 52%.
McAfee's study also found that AI voice scams are becoming quite common. "A quarter of adults surveyed globally have experience of an AI voice scam, with one in 10 targeted personally, and 15% saying somebody they know has been targeted," the report said.