Microsoft is already undoing some of the limits it placed on Bing AI

Last week, Microsoft limited how much people could talk to Bing, and after complaints, it’s expanding access again.

A Bing logo.
Image: The Verge
Richard Lawler
is a senior editor following news across tech, culture, policy, and entertainment. He joined The Verge in 2021 after several years covering news at Engadget.

Microsoft is already loosening restrictions it recently placed on interactions with the Bing AI chatbot and said it’s going to start testing another option that lets users choose the tone of the chat, with options for Precise (shorter, more focused answers), Creative (longer and more chatty), or Balanced for a bit of both.

After repeated reports of strange behavior (like the time it split into multiple personalities and one of them offered us furry porn) and jailbreaks that did things like expose its secret rules, Microsoft set new rules last Friday limiting how many interactions testers could have and how long they could last. The limits cut testers down to five turns per session and a maximum of 50 per day.

According to Microsoft, longer chat sessions can make Bing “become repetitive or be prompted / provoked to give responses that are not necessarily helpful or in line with our designed tone.” Now it has turned that back up to six chat turns per session with a max of 60 chats per day for any testers who have access, and the post on its blog says the plan is to increase the daily cap to 100 sessions soon and to allow searches that don’t count against the chat total.

Microsoft has said it didn’t “fully envision” people using the Bing AI bot as social entertainment, but with even experienced tech reporters failing this version of the mirror test, it introduced new guardrails.

Ars Technica reported that commenters on Reddit complained about last week’s limit, saying Microsoft “lobotomized her,” “neutered” the AI, and that it was “a shell of its former self.” These are the people the Bing Team is responding to in its post:

These long and intricate chat sessions are not something we would typically find with internal testing. In fact, the very reason we are testing the new Bing in the open with a limited set of preview testers is precisely to find these atypical use cases from which we can learn and improve the product.

Since placing the chat limits, we have received feedback from many of you wanting a return of longer chats, so that you can both search more effectively and interact with the chat feature better.
