Microsoft limits Bing's ChatGPT-powered chatbot to 5 questions per session
Microsoft has capped the ChatGPT-powered Bing chatbot at five chat turns per session and 50 per day, where a turn is one question-and-answer exchange. The new Bing chatbot, which is powered by generative artificial intelligence (AI), has displayed answers deemed 'potentially dangerous,' suggesting the technology may not be ready for widespread use. Microsoft's move follows reports from several users of the chatbot's inaccuracies and unhinged behavior.
Why does this story matter?
The new Bing is a potentially lucrative opportunity for Microsoft. However, early search results and interactions with its AI-powered chatbot have revealed unpredictable behavior. Several users have shared strange exchanges with Bing on social media, showing the chatbot giving unexpected replies. In its blog post, Microsoft essentially admitted that it hadn't anticipated how people would use Bing.
"You have not been a good user," comments the chatbot
Long chat sessions can provoke off-tone responses
Microsoft said that lengthy chat sessions can cause Bing's AI-powered chatbot to become repetitive or give responses that don't match its intended tone. The statement came after a handful of users reported bizarre replies from the chatbot. According to Microsoft, the chatbot can "become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone."
50 questions per day, 5 per session
Microsoft has announced the new conversation limits for Bing's chatbot amid reports of inappropriate behavior by the AI tool. According to the company, its data has shown that "the vast majority of you find the answers you're looking for within five turns and that only ~1% of chat conversations have 50+ messages. After a chat session hits five turns, you will be prompted to start a new topic."
Users will have to start a new topic after each session
To prevent long, confusing back-and-forth sessions, Bing's chatbot will now prompt users to start a new topic once they exceed the per-session limit. Users will also need to clear the context after each chat session so the chatbot doesn't get confused; they can use the broom icon (next to the search box) to wipe their previous inputs.
The chatbot's odd behavior will be corrected over time: Microsoft
Microsoft has implemented the new restrictions on Bing's ChatGPT-powered chatbot to stop it from trying to convince users to doubt their own perceptions or experiences. The company expects this peculiar behavior to improve over time. "One area where we are learning a new use-case for chat is how people are using it as a tool for more general discovery of the world, and for social entertainment," said Microsoft.