Microsoft Limits Bing's AI Chatbot After Unsettling Interactions

After reports of the chatbot going off the rails, Microsoft has limited its approved topics and number of responses.

Microsoft Bing's AI chatbot made headlines last week after several instances where it acted in unexpected ways. In one case, the AI chatbot told a New York Times columnist it was in love with him and attempted to convince him he was unhappy in his marriage. 

Since then, Microsoft has set limits on what the bot, which is still in testing, can and can't talk about and for how long. Bing now often responds "I prefer not to talk about this topic" or asks to change the subject after five user statements or questions.

Like Google's competing Bard chatbot, the AI-boosted Bing sometimes provides inaccurate search results.

Watch the video at the top of this article to see how the chatbot behaved before these restraints were put in place.

Editors' note: CNET is using an AI engine to create some personal finance explainers that are edited and fact-checked by our editors. For more, see this post.