Artificial intelligence, for all its promise, is now being exploited by criminals to deceive unsuspecting phone users. With AI, a voice can be cloned rapidly, producing convincing replicas that scammers use to pose as friends or family and swindle victims out of money or sensitive information.
According to Jasdev Dhaliwal of McAfee, voice cloning, also known as voice synthesis or mimicry, allows a person's voice to be replicated with astonishing accuracy. Originally developed for benign purposes such as voice assistants, the technology has unfortunately become a tool for malicious actors seeking to exploit victims, The Sun reported.
Identifying a cloned voice is challenging, but two strategies can mitigate the risk. Firstly, establishing a safe word with loved ones provides a quick authenticity check during unexpected calls requesting money or sensitive data. Asking for the safe word can expose a scam, as fraudsters are unlikely to know it.
Secondly, asking personal questions can help confirm a caller's identity, since queries about shared memories or past experiences are difficult for a fraudster to answer. If suspicions persist, contacting the person through another channel is advised.
Beyond these precautions, caution is warranted whenever a caller asks for money to be sent through unconventional means. Statistics underline the prevalence and impact of phone scams, with millions of Americans falling victim and losing significant amounts of money.
To combat these scams, individuals are advised never to disclose personal or financial information over the phone. Enrolling in the Do Not Call Registry and utilising spam call-filtering apps can reduce exposure to fraudulent calls. Furthermore, exercising discretion when sharing phone numbers and scrutinising requests for unconventional money transfers are crucial steps to guard against falling victim.