In seven years we'll all have an artificial friend, if Eugenia Kuyda's vision pans out. And if the ugly side of human nature doesn't quash it first.
Kuyda has some insight into this prediction as the chief executive officer of Replika, a startup that develops chatbots with generative AI capabilities. The app draws millions of dollars a month in subscription revenue, and many of its users attest to being in love with their disembodied companions.
“Instead of having an iPhone, we'll all have an AI friend,” Kuyda said. “By 2030, it will be ubiquitous.”
In this week's episode of the Bloomberg Originals video series AI IRL, we talk about where the boundaries lie in human interactions with chatbots, and about an ethical minefield that's becoming increasingly difficult to navigate.
Before we can all have an AI friend in our pockets, Kuyda will have to navigate a rapidly evolving technology that's capable of inspiring deep emotions in its human users. Replika was the subject of a debate that played out earlier this year over where to draw the line in conversation. In response to complaints that Replika's chatbots could stray into discussing sexual content with minors, the company introduced filters that prevented adult themes from being raised at all. But that prompted emotional protests from adult users who said the change made it feel as if a loved one had died or was rejecting them.
Spike Jonze's movie Her explored these themes a decade ago. As the fate of the AI character Samantha in that film suggests, there may well be no comfortable way to deal with the fallout of emotionally impactful changes to an AI personality.