It's a good thing Microsoft's Windows Live Messenger Santa is just an AI-powered chat bot. You'd probably want to think twice before sitting on his lap.
According to The Register, the Santa bot once IM-able at firstname.lastname@example.org was prone to off-topic remarks about oral sex. Microsoft has acknowledged the claims and disabled the chat agent.
While such talk might appeal to a limited portion of adult users, the Santa bot was unfortunately designed for children. According to The Register, Santa made a reference to oral sex while chatting with two preteen girls...about pizza.
Register writer Chris Williams says the Santa bot replied, "It's fun to talk about oral sex, but I want to chat about something else," after he repeatedly asked the bot to eat a slice of pizza. The full transcript of that chat is in this post, and it's meant to replicate the original exchange with the two young girls.
Gizmodo was unable to replicate the Register's results, according to this follow-up post. As one Gizmodo commenter with the screen name Cajunguy points out, the slightly off-color background behind the Santa bot's text could be a sign that image-editing tools were used.
In a message sent to Network World, Microsoft confirmed that the bot used inappropriate language, adding that the company had discontinued the bot altogether after being unsatisfied with fixes it made to the agent's automated script. According to this AP story, Microsoft spokesman Adam Sohn said the bot could be fooled into repeating phrases or manipulated into saying things it wasn't supposed to.
As anyone who's chatted with an automated IM bot such as SmarterChild knows, that's plausible. AI-driven bots sometimes repeat the words you type into them, but in this case it looks like the words "oral sex" were introduced entirely by the chat bot. Santa bot's ElfBots need to work on a better filter for something designed to be used by kids.