Chatbots are back in a big way in the form of services like ChatGPT and Bing. For good or ill, these AIs are bringing plenty of entertainment to the internet, proving weirdly effective one moment and completely incorrect and delusional the next. What we don't necessarily realise when playing with these new internet tools is just how much work has gone into getting them to this somewhat functional level. According to The Verge, in Bing's case this is a bot at least six years in the making.
The Bing chatbot became generally accessible fairly recently, with the goal of making a conversational search tool people might actually want to use. The Bing subreddit has since exploded with many people doing just that, often with hilarious results. One of my personal favourites sees Bing become weirdly aggressive towards a user after they inform it that the newest Avatar movie is in fact out, because Bing doesn't know what year it is.
This is all good fun, especially as long as people aren't taking the answers from chatbots too seriously. But of course, as they get more convincing, it's understandable why people might take them at their word, especially when they're integrated into official search services.
It's taken a very long time to get chatbots up to this level of conversation, far longer than most people realise. Microsoft has been dreaming of a conversational search AI for years, and this iteration of Bing can be traced back to about 2017. Back then it was called Sydney, and it was still split into multiple bots for different services, but it has since been folded into a single AI for general queries.
Read more on pcgamer.com