Microsoft has announced that it will limit the number of questions users can ask the chat model in the revamped Bing search engine.
The Redmond technology giant has been testing its ChatGPT-powered model for Bing with a select group of users who signed up to be among the first to test drive the new experience.
The first few days of testing have thrown up some interesting observations. People found that the chatbot was prone to errors and, in some cases, bizarre responses that seemed emotionally manipulative.
Microsoft then put out a blog post explaining that long chat sessions tend to confuse the bot, and that chats running past fifteen turns have produced strange answers.
The company has announced that it will limit conversations to five turns per session and fifty chat turns per day. Microsoft describes a ‘turn’ as a conversation exchange containing a user query and a reply.
In the same blog post, the company said internal data showed that the vast majority of users could “find the answers” they were looking for “within 5 turns”.
After five turns, the model will automatically prompt you to change the topic of the conversation. Microsoft said that at the end of each session, the context data needs to be cleared “so the model won’t get confused”.
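For readers curious about what these limits look like in practice, here is a minimal, hypothetical sketch in Python of a turn counter enforcing a five-turn session cap and a fifty-turn daily cap, with a context-clearing reset. The class and method names are invented for illustration; Microsoft has not published how its limits are actually implemented.

```python
from datetime import date

SESSION_TURN_LIMIT = 5   # turns allowed per chat session (per the article)
DAILY_TURN_LIMIT = 50    # total chat turns allowed per day (per the article)

class TurnLimiter:
    """Hypothetical per-user turn limiter; purely illustrative."""

    def __init__(self):
        self.session_turns = 0
        self.daily_turns = 0
        self.day = date.today()

    def _roll_day(self):
        # Reset the daily counter when the calendar day changes.
        today = date.today()
        if today != self.day:
            self.day = today
            self.daily_turns = 0

    def record_turn(self) -> str:
        """Count one query-plus-reply exchange and report the state."""
        self._roll_day()
        if self.daily_turns >= DAILY_TURN_LIMIT:
            return "daily limit reached - try again tomorrow"
        if self.session_turns >= SESSION_TURN_LIMIT:
            return "session limit reached - clear the context to start a new topic"
        self.session_turns += 1
        self.daily_turns += 1
        if self.session_turns == SESSION_TURN_LIMIT:
            return "turn 5 - prompting the user to change the topic"
        return "ok"

    def clear_context(self):
        # The 'broom icon': wipe the session so the model starts fresh.
        self.session_turns = 0

limiter = TurnLimiter()
for i in range(6):
    print(i + 1, limiter.record_turn())
limiter.clear_context()  # user clicks the broom; a new session begins
```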
To make this easier, Microsoft has added a small broom icon that users can click to clear the context, similar to how you would erase browser history.