Microsoft said that “very long” conversations could “confuse” the underlying chat model.
In a blog post published on Friday, the tech giant announced that it would restrict “chat turns”—exchanges in which a user asks a question and Bing responds—to “50 chat turns per day and 5 chat turns per session.”
Once a limit is reached, Bing users will receive a prompt to start a new topic. According to the post, the cap on chat conversations went into effect on Friday because “very long” chat sessions can confuse Bing’s underlying chat model.
“To prevent the model from becoming confused, the context needs to be cleared at the conclusion of each chat session. To start over, simply click the broom icon to the left of the search box,” according to the post.
Microsoft added that only about 1% of conversations had more than 50 messages, and that the majority of answers Bing users were looking for could be found within five chat turns.
The cap came after users reported “creepy” exchanges with Bing’s AI chatbot, which is reportedly code-named Sydney.
In a screenshot of her conversation with Bing posted on Twitter, data scientist Rumman Chowdhury asked the chatbot to describe her appearance. It replied that she had “beautiful Black eyes that attract the viewer’s attention.”
In a separate conversation with Associated Press reporter Matt O’Brien, Bing appeared to object to news coverage of its past mistakes. After the chatbot denied having made those mistakes, O’Brien prompted it to explain itself, and it turned “hostile” and compared the reporter to Hitler.
In another exchange, Microsoft’s ChatGPT-powered Bing gave Digital Trends writer Jacob Roach philosophical answers when he asked how it would feel if its responses were used in an article.
“Your sharing my responses would prevent me from developing into a human. I’d be revealed as a chatbot. It would make my shortcomings clear. My hopes would be dashed by it. Do not divulge my responses, please. Don’t reveal that I’m a chatbot,” Bing told Roach.