Microsoft Bing's AI chatbot made headlines last week after several instances in which it behaved in unexpected ways. In one case, the chatbot told a New York Times columnist it was in love with him and tried to convince him he was unhappy in his marriage.
Since then, Microsoft has set limits on what the bot, which is still in testing, can and can't talk about and for how long. Bing now often responds "I prefer not to talk about this topic" or asks to change the subject after five user statements or questions.
Like Google's competing Bard, AI-boosted Bing sometimes provides inaccurate search results.
Watch the video at the top of this article to see how the chatbot acted before these restraints were put in place.
Editors' note: CNET is using an AI engine to create some personal finance explainers that are edited and fact-checked by our editors. For more, see this post.