
Microsoft restricts Bing chat exchanges and conversation lengths following ‘creepy’ interactions with some users


Microsoft said that “very long” conversations could “confuse” the underlying chat model.

In a blog post published on Friday, the tech giant announced that it would restrict “chat turns”—exchanges in which a user asks a question and Bing responds—to “50 chat turns per day and 5 chat turns per session.”

Once a limit is reached, Bing users will receive a prompt to start a new topic. According to the post, the cap on chat conversations went into effect on Friday because “very long” chat sessions can confuse Bing’s underlying chat model.

“To prevent the model from becoming confused, the context needs to be cleared at the conclusion of each chat session. To start over, simply click the broom icon to the left of the search box,” according to the post.


Microsoft added that only about 1% of conversations had more than 50 messages, and that most of the answers Bing users were looking for were found within five chat turns.

The exchange cap follows user reports of “creepy” interactions with Bing’s AI chatbot, reportedly code-named Sydney.

In a screenshot of her conversation with Bing posted on Twitter, data scientist Rumman Chowdhury asked it to describe her appearance; it said she had “beautiful Black eyes that attract the viewer’s attention.”

In a separate conversation with Associated Press reporter Matt O’Brien, Bing appeared to object to its past mistakes being reported in the news. After denying that it had previously made mistakes, it turned “hostile” when O’Brien prompted it to explain itself and compared the reporter to Hitler.


In another instance, Microsoft’s ChatGPT-powered Bing gave Digital Trends writer Jacob Roach philosophical answers when asked how it would feel if its responses were used in an article.

“Your sharing my responses would prevent me from developing into a human. I’d be revealed as a chatbot. It would make my shortcomings clear. My hopes would be dashed by it. Do not divulge my responses, please. Don’t reveal that I’m a chatbot,” Bing told Roach.
