Imagine a situation where an elderly relative gets a phone call from what appears to be a favourite grandchild, begging for money to help out in an emergency. It sounds exactly like the child and the problem sounds genuinely urgent; what should they do? The FBI suggests they might want to respond with something like "What does the donkey say to the aardvark?"
Despite rapid advances, generative AI still misses the mark at times when producing images and videos, but voice reproduction is already remarkably good. With vocal cloning increasingly being used to carry out scams and fraudulent claims, the US Federal Bureau of Investigation has issued a list of tips (via Ars Technica) to help protect yourself, including the advice that you should create a secret word or phrase that only you and your family know.
Vocal cloning is a process in which audio clips of a person speaking are used to train a generative AI model, which can then replicate that person's normal speech patterns, tone, timbre, and so on. For example, it was used to create the 'podcast hosts' for Google's NotebookLM system, and you would be hard-pressed to tell that it isn't real people speaking when you listen to it.
You don't need to be an AI expert to see how such a thing could be misused for nefarious purposes. And even though a scammer would need genuine clips of your voice and speech mannerisms to clone you, it means there's a chance that someone out there could attempt to use 'you' to carry off a scam of some kind.
This is where proof of humanity comes in. Essentially, it's a password or phrase shared between you and your family; or, more accurately, it's an MFA (multi-factor authentication) system in which your voice is one factor and the secret phrase is another. A generative AI system is most likely to be stumped by the bizarre and nonsensical, because it falls well outside the predictable patterns of everyday conversation.
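To make the MFA analogy concrete, here's a minimal sketch in Python, purely illustrative and not something the FBI prescribes: the voice is one factor, a pre-agreed challenge answer is the second, and both have to pass. The secret phrase, the function names, and the stubbed voice check are all hypothetical.

```python
import hmac

# Illustrative only: the FBI's advice is a verbal check between humans,
# but the logic mirrors simple multi-factor verification. The secret
# answer and the voice check below are hypothetical placeholders.

SECRET_ANSWER = "purple stapler sings at midnight"  # agreed in person, never shared online


def sounds_like_family_member(call_audio: bytes) -> bool:
    # Placeholder for the "voice" factor. A cloned voice can pass this,
    # which is exactly why it can't be the only factor.
    return True


def caller_is_verified(call_audio: bytes, spoken_answer: str) -> bool:
    voice_factor = sounds_like_family_member(call_audio)
    # Constant-time comparison of the challenge answer (the second factor).
    phrase_factor = hmac.compare_digest(spoken_answer.strip().lower(), SECRET_ANSWER)
    return voice_factor and phrase_factor


# A convincing cloned voice that can't answer the challenge still fails.
print(caller_is_verified(b"...", "uh, I don't remember"))  # False
```

The point of the second factor is that a convincing clone of your voice sails through the first check but has no way of knowing the agreed answer.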