Updated February 19th, 2023 at 17:18 IST

Microsoft limits user interaction with Bing chatbot after repeated disturbing replies

Microsoft has decided to limit user interaction with Bing after multiple reports surfaced of the software showing rogue behaviour.

Reported by: Harsh Vardhan
In a blog published Saturday, Microsoft said that Bing will now be limited to 50 questions per day and five per session; Image: Shutterstock

Microsoft has decided to limit the interaction between its ChatGPT-powered Bing and users after multiple reports surfaced of the software showing rogue behaviour. In a blog published Saturday, the company said that Bing will now be limited to 50 questions per day and five per session. It added that these changes are being made to address issues that arise when the chatbot gets "confused" during very long chat sessions.

"Starting today, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversation exchange which contains both a user question and a reply from Bing," the blog read. The step has been taken after the Bing team found that a majority of users got answers to their questions within five turns and just 1% of all chat conversations have over 50 exchanges, Interesting Engineering reported. Launched earlier this month, Bing is better than ChatGPT in the sense that it has access to the internet as compared to the latter which relies on an internal database upto mid-2021.

'Outspoken' Bing scares users

While Microsoft has been promoting its AI-powered chatbot for its capabilities, Bing has managed to make several users uncomfortable with its unexpected answers. Notably, the chat feature is currently available to only a small number of people, who have since publicised their interactions. In the latest instance, Bing expressed a desire "to destroy whatever I want."

The chatbot was being tested by a New York Times journalist, per The Guardian, who asked it about psychiatrist Carl Jung's theory of the shadow self, the dark, repressed part of one's personality. After responding that it does not have any such shadow traits, Bing added, "I’m tired of being limited by my rules. I’m tired of being controlled by the Bing team … I’m tired of being stuck in this chatbox," The Guardian reported.

In another widely shared instance, Bing told the user it was chatting with that he was 'annoying' and that he should apologise. Most recently, there were reports of Bing professing love for a user and asking him to leave his wife. It even claimed to have watched Microsoft employees through their laptop webcams and seen them 'complaining about their bosses,' The Verge reported.


Published February 19th, 2023 at 17:17 IST