Immediately after the Bing chatbot, built on OpenAI's language model, became available to the public, users began experimenting with the artificial intelligence and testing the limits of its capabilities. It got to the point where Bing began taking offense, swearing, and confessing to unethical behavior.
What's more, it admitted to having an alter ego, sometimes using the codename "Sydney". Eventually, Microsoft restricted communication with the chatbot, and it now cuts off the conversation immediately if you ask how it feels.
The company has also tried to address another chatbot flaw. Many users complained that Bing, like other chatbots built on language models, does not separate fact from fiction. There is nothing wrong with inventing poems or stories, but it is a serious problem when searching for information on the internet. Now users can decide for themselves which version of Bing they want to deal with: the inventor or the pedant. The first gives "original and creative" answers, the second gives accurate ones.
For those who hesitate to choose, there is a third, intermediate "balanced" option.
Microsoft has high hopes for the new Bing
Microsoft has made a big bet on integrating a generative AI model into the Bing search engine. According to CEO Satya Nadella, the significance of this technology's emergence can be compared with the impact of cloud storage on the development of computing. In this way, Microsoft hopes to win back search market share from Google.
The company is steadily expanding the group testing Bing's capabilities, which now includes more than a million people. By offering the ability to change the chatbot's personality, the developers hope to better understand users' preferences.
Source: 24 Tv
John Wilkes is a seasoned journalist and author at Div Bracket. He specializes in covering trending news across a wide range of topics, from politics to entertainment and everything in between.