
Microsoft Limits Bing's AI Chatbot After Unsettling Interactions

After reports of the chatbot going off the rails, Microsoft has limited its approved topics and number of responses.

Justin Eastzer, Former Senior Video Producer

Microsoft Bing's AI chatbot made headlines last week after several instances in which it behaved in unexpected ways. In one case, the chatbot told a New York Times columnist it was in love with him and attempted to convince him he was unhappy in his marriage.

Since then, Microsoft has set limits on what the bot, which is still in testing, can and can't talk about, and for how long. Bing now often responds with "I prefer not to talk about this topic" or asks to change the subject after five user statements or questions.

Like Google's competing Bard, the AI-boosted Bing sometimes provides inaccurate search results.

Watch the video at the top of this article to hear about how the chatbot acted before these restraints were put in place.

Editors' note: CNET is using an AI engine to create some personal finance explainers that are edited and fact-checked by our editors. For more, see this post.