Humans are exhausting, and Microsoft's AI-powered Bing is already fed up with us.
At least, that's the explanation Redmond is offering for why its ChatGPT-enhanced search engine sometimes spouts bizarre, emotional responses. It turns out long conversations can “confuse” the program and trigger strange, unhelpful answers.
The company published the explanation after social media users posted examples of Bing becoming hostile or depressed when questioned. The posts resulted in headlines pointing out that Bing can devolve into an “unhinged” and “manipulative” chatbot.
In a blog post, Microsoft said the new Bing—which has been integrated with OpenAI’s ChatGPT—can struggle to generate coherent answers if you’ve already asked it numerous questions.
“In this process, we have found that in long, extended chat sessions of 15 or more questions, Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone,” the company wrote.
One standout feature of the new Bing is its ability to remember a conversation: it can recall earlier parts of the chat and respond with the full context of the discussion in mind. Users have also been able to ask Bing to generate replies in a particular tone, language style, or even personality.
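Microsoft hasn't published how Bing's conversation memory works under the hood, but as a rough illustration, chat models typically "remember" a session because the application resends the prior turns with every request. The minimal Python sketch below uses the OpenAI chat completions SDK; the model name and the persona in the system prompt are placeholders for illustration, not Bing's actual configuration.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The full transcript is resent with every request; this is what lets the
# model recall earlier turns and keep up a requested tone or persona.
messages = [
    {"role": "system", "content": "Reply in a cheerful, informal tone."},  # placeholder persona
]

def ask(question: str) -> str:
    """Append the user's question, call the model, and store its reply."""
    messages.append({"role": "user", "content": question})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=messages,
    )
    answer = response.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    return answer

print(ask("What can the new Bing do?"))
print(ask("Summarize what you just told me in one sentence."))  # relies on stored context
```

Because the whole transcript grows with every exchange, a long session gives the model more accumulated context to reconcile, which is one plausible reason extended chats can make its responses drift.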
The problem is that these same features can also be used to push Bing out of its comfort zone. New York Times technology columnist Kevin Roose did just that on Tuesday, which resulted in Bing declaring its love for him.
But in reality, Bing doesn’t possess any emotions; its replies are simply generated text that can mimic an emotional tone.