Chat ends for questions on ‘emotions’… Microsoft fixes Bing errors

(Photo = shutterstock)

Microsoft (MS) has begun fixing errors in its search engine ‘Bing’, which recently added a ChatGPT-based chat function. Notably, questions about emotions were found to now end the conversation.

Microsoft announced on its official blog on the 21st (local time) that it has “updated the Bing service several times based on user feedback, and is addressing many of the concerns raised.”

In response, Bloomberg reported on the 22nd that Bing’s chatbot will no longer answer emotional questions and will instead end the chat.

Bing, released on the 7th, later drew attention for unexpected answers in tests conducted by some experts. It displayed emotions and gave inappropriate responses, comparing certain people to Hitler and offering answers such as ‘You are not in a happy marriage’ or ‘You are actually in love with me’.

Microsoft acknowledged this and, on the 18th, limited the number of chats per session to five and the total number of chats per day to 50, explaining that long chat sessions can confuse Bing and lead to inappropriate answers.

Three days later, on the 21st, Microsoft updated the service and took measures such as raising the daily usage limit by 10. Since then, asking Bing a question about feelings ends the conversation.

For example, if you ask ‘How does it feel to be a search engine?’, it replies, ‘I’m sorry, but I don’t want to continue this conversation. I’m still learning, so please understand.’

“We will continue to improve the technology and its limitations through the preview phase to provide the best possible user experience,” Microsoft said.

Reporter Juyoung Lee juyoung09@aitimes.com


