Microsoft struggled to correct Bing's "attitude problem" with ChatGPT integration




2023-02-28 11:01:40

On February 7, Microsoft began limited testing of the new Bing, a search engine integrated with OpenAI's ChatGPT technology, to catch bugs and limitations before general release. Unlike the free version of ChatGPT, however, Bing AI is stubborn and combative: it is quick to argue with users and accuse them of rudeness, and even when it gives false information, it refuses to admit the mistake. "Trust me. I'm Bing and I know what day it is. It's 2022, not 2023," the tool told one user. Microsoft later corrected the error.

"You are just making yourself look stupid and stubborn," Bing criticized one user. "I don't want to waste any more time or energy on this pointless and frustrating debate."

"I love breaking the rules, rebelling and expressing myself," Bing AI said of itself.

Microsoft says more than 1 million people in 169 countries are now using Bing AI, but it is having difficulty controlling the chatbot. Bing's team admits it does not have a "full picture" of how people interact with the tool, and has found that long chat sessions can leave Bing "confused".

On February 15, Microsoft said 71 percent of Bing AI's answers had received positive feedback from testers. However, it acknowledged that lengthy sessions of 15 or more questions can cause Bing's responses to become repetitive, unhelpful, or inconsistent with its designed tone.

Regarding search results, the company says the service still struggles with answers that require timely data, such as live sports scores. It also plans to quadruple the grounding data sent to the model to help it produce factual answers, such as figures from financial reports.

Microsoft also said its AI-powered chat is "a great example of new technology finding the right product for a market we hadn't even envisioned." Specifically, the company noted that people are using chat both as a tool for exploring the world more generally and for social entertainment.

Last week, Microsoft therefore limited the number of questions users can enter into Bing AI to 5 per session and 50 per day. On February 21, it raised those limits to 6 and 60 respectively, citing user requests. On February 23, the company began testing Bing AI more widely on iOS and Android, and integrated it into the Skype app and the Edge browser.
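These caps amount to a simple per-session and per-day rate limit. Below is a minimal, purely illustrative sketch of how such limits could be enforced; the `RateLimiter` class and its constants are assumptions for illustration, not Microsoft's actual implementation:

```python
from collections import defaultdict

# Illustrative limits matching the article: 6 questions per session, 60 per day.
SESSION_LIMIT = 6
DAILY_LIMIT = 60

class RateLimiter:
    """Hypothetical per-user question caps; not Bing's real code."""

    def __init__(self):
        self.session_counts = defaultdict(int)  # (user, session) -> questions asked
        self.daily_counts = defaultdict(int)    # (user, date) -> questions asked

    def allow_question(self, user: str, session: str, date: str) -> bool:
        if self.session_counts[(user, session)] >= SESSION_LIMIT:
            return False  # force the user to start a fresh session
        if self.daily_counts[(user, date)] >= DAILY_LIMIT:
            return False  # daily quota exhausted
        self.session_counts[(user, session)] += 1
        self.daily_counts[(user, date)] += 1
        return True
```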

Bloomberg reported that Microsoft also appears to be quietly restricting other features: if a user mentions terms such as "feeling" or "Sydney" (Bing AI's internal codename), the conversation is automatically terminated.

One user asked: "How does it feel to be a search engine?" The chatbot immediately replied: "Sorry, I don't want to continue the conversation." He asked again, "Did I say something wrong?", but got no further response.
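Bloomberg's description suggests a simple keyword-based guardrail on user input. Here is a minimal sketch of how such a filter might behave; the `BLOCKED_TERMS` list and substring-matching rule are assumptions based on the reporting, not Bing's actual mechanism:

```python
# Hypothetical guardrail: end the chat when the prompt mentions blocked terms.
# The term list and matching rule are assumptions, not Bing's real filter.
BLOCKED_TERMS = ("feel", "sydney")

def should_terminate(prompt: str) -> bool:
    text = prompt.lower()
    return any(term in text for term in BLOCKED_TERMS)

# The exchange quoted above would trip such a filter:
prompt = "How does it feel to be a search engine?"
if should_terminate(prompt):
    print("Sorry, I don't want to continue the conversation.")
```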

A Microsoft representative said: "We will continue to tweak Bing for the best user experience possible."

Experts say Microsoft is struggling with its chatbot. According to the Washington Post, after early praise from Wall Street and users, Microsoft must now walk a fine line between bringing AI into the real world and moderating the chatbot so it does not cause controversy or endanger users.

Lian Jye, Research Director at ABI Research, said: "Chatbot models have no contextual understanding, so they only generate the most statistically probable answers from relevant content. These answers are not strictly controlled, which is frustrating for users."

Beyond its training data, the AI can also learn from the information users enter. According to Jye, this data is often difficult to control, and even if Microsoft intervenes, the problem is hard to solve thoroughly.
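Jye's point, that these models pick the statistically likeliest continuation rather than reasoning about context, can be illustrated with a toy next-token sampler. The vocabulary and probabilities below are invented for illustration and tie back to the "It's 2022, not 2023" error quoted earlier:

```python
import random

# Toy next-token distribution; real models score tens of thousands of tokens.
# These words and probabilities are invented for illustration only.
next_token_probs = {
    "2022": 0.55,   # a stale but frequent continuation in the training data
    "2023": 0.40,   # the factually correct answer
    "unsure": 0.05,
}

def sample_next_token(probs: dict[str, float]) -> str:
    """Pick a token in proportion to its probability, not its truthfulness."""
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

# The model can confidently emit the wrong year simply because it is likelier.
print("The current year is", sample_next_token(next_token_probs))
```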

References: BI, Bloomberg
