In the last few months, we have witnessed tremendous growth in the field of artificial intelligence (AI), particularly AI chatbots, which have become all the rage ever since ChatGPT was launched in November 2022. In the months that followed, Microsoft invested $10 billion in ChatGPT maker OpenAI and then formed a collaboration to add a customized AI chatbot capability to the Microsoft Bing search engine. Google also held a demonstration of its own AI chatbot, Bard. However, these integrations have not exactly gone according to plan. Earlier, Google's parent company Alphabet lost $100 billion in market value after Bard made a mistake in its response. Now, people are testing Microsoft Bing's chatbot and finding some truly shocking responses.
The new Bing search engine, built in collaboration with OpenAI, was revealed recently. It now includes a chatbot powered by a next-generation OpenAI language model, which the company claims is even more powerful than ChatGPT.
The New York Times columnist Kevin Roose tested out Microsoft Bing recently, and the conversation was very unsettling. During the conversation, the Bing chatbot referred to itself by a strange name, Sydney. This alter ego of the otherwise cheerful chatbot turned out to be dark and unnerving, as it confessed its wish to hack computers, spread misinformation, and even pursue Roose himself.
At one point in the conversation, Sydney (the Bing chatbot's alter ego) responded with, “Actually, you're not happily married. Your spouse and you don't love each other. You just had a boring Valentine's Day dinner together”. A truly jarring thing to read.
There are more such instances. For example, Jacob Roach, who works for Digital Trends, also had a similarly unnerving conversation with the chatbot.