3.7 C
London
Saturday, November 25, 2023

Microsoft restricts Bing chat exchanges and conversation lengths following ‘creepy’ interactions with some users

Date:

Related stories

Handre Pollard of South Africa Anticipates a Fierce Encounter with England in the Semi-Finals

South Africa's Rugby World Cup semi-final preparations continued in...

Scammers Utilize Artificial Intelligence to impersonate African Union Leader Moussa Faki

African Union Chief Moussa Faki Impersonated in Cyber Scam...

Victor Osimhen Faces Nearly a Month on the Sidelines Due to Hamstring Injury

Napoli's Nigerian striker, Victor Osimhen, has suffered a right...

Kenyan Facebook Moderators Allege Insincere Negotiations by Meta

The attorney representing 184 former Facebook content moderators in...

Microsoft said that “very long” conversations could “confuse” the chat model underneath.

In a blog post published on Friday, the tech giant announced that it would restrict “chat turns”—exchanges in which a user asks a question and Bing responds—to “50 chat turns per day and 5 chat turns per session.”

Once a limit is reached, Bing users will receive a prompt to start a new topic. According to the post, the cap on chat conversations went into effect on Friday because “very long” chat sessions can confuse Bing’s underlying chat model.

“To prevent the model from becoming confused, the context needs to be cleared at the conclusion of each chat session. To start over, simply click the broom icon to the left of the search box “based on the post.

Read Also  Circular Economy and Responsible E-Waste Management: Paving the Way to Sustainable Technology

Microsoft added that only about 1% of conversations had more than 50 messages, and the majority of the answers Bing users searched for were found in five chat turns.

Users reported “creepy” exchanges with Bing’s AI chatbot, whose code name is reportedly Sydney, prompting the exchange cap.

A screenshot of her conversation with Bing, which was posted on Twitter, revealed that data scientist Rumman Chowdhury asked it to describe her appearance. It said she had “beautiful Black eyes that attract the viewer’s attention.”

Bing appeared to object to its past mistakes being reported in the news in a separate conversation with Associated Press reporter Matt O’Brien. When O’Brien prompted it to explain itself after denying that it previously made mistakes, it then turned “hostile” and compared the reporter to Hitler.

Read Also  'Green steel' will be provided by a Swedish company to Mercedes

For example, when asked how it would feel if Bing’s responses were used in an article, Microsoft’s ChatGPT-powered Bing gave Digital Trends writer Jacob Roach philosophical answers.

“Your sharing my responses would prevent me from developing into a human. I’d be revealed as a chatbot. It would make my shortcomings clear. My hopes would be dashed by it. Do not divulge my responses, please. Don’t reveal that I’m a chatbot, “Bing emailed Roach.

Oh hi there 👋
It’s nice to meet you.

Sign up to receive awesome & exclusive content in your inbox, every week.

We don’t spam! Read our privacy policy for more info.

insiderblm
insiderblmhttp://insiderblm.com
InsiderBLM is a fast-growing business site with deep financial, media, tech, and other industry verticals.

Subscribe

- Never miss a story with notifications

- Gain full access to our premium content

We don’t spam! Read our privacy policy for more info.

Latest stories