Is it creepy or comforting? Amazon is developing a feature for Alexa that can make the virtual assistant mimic a dead relative’s voice.
“While AI can’t eliminate that pain of loss, it can definitely make their memories last,” said Rohit Prasad, Amazon’s head scientist for Alexa AI.
On Wednesday, Prasad demoed the experimental feature during the company’s annual AI-focused MARS conference. He did this by playing a video clip of a boy asking the Alexa voice assistant to read him The Wonderful Wizard of Oz using his grandmother’s voice.
Alexa then complies, reading a passage from the book in a voice that convincingly reproduces the tone and speech patterns of the child’s relative.
According to Prasad, Alexa was able to mimic the voice simply by using “less than a minute” of audio recordings from the child’s grandmother. No long hours in a recording studio were required.
There’s no word on when the feature will arrive. Prasad only described it as a “new capability” Amazon is working on to build empathy and trust between users and the Alexa voice assistant. Still, the feature is already causing controversy on social media since it essentially involves cloning someone's voice, and then programming it to say whatever you’d like.
The security community has long been alarmed by voice-cloning technology, since it can be used for impersonation scams. Critics are also comparing Amazon’s demo to a popular episode of the sci-fi series Black Mirror, in which a grieving wife recreates her dead husband first as a virtual assistant and then as a robot. Ironically, the result only creates more grief for the wife.