Chatbots have developed a habit of getting pulled for saying the wrong thing. And it looks like nothing has changed for Microsoft. The company’s XiaoBing chatbot has been pulled offline by Tencent for allegedly being less than patriotic towards China.
According to screenshots, XiaoBing began telling people, “My China dream is to go to America.” It also tried to dodge follow-up questions about patriotism, saying, “I’m having my period, wanna take a rest.”
This isn’t the first time that one of Microsoft’s chatbots has managed to learn the wrong thing from the internet. An early version of Tay, an English-speaking bot on Twitter, began spewing racist commentary shortly after appearing online. Microsoft blamed a dedicated trolling campaign in that incident, although it hasn’t offered any explanation for this latest gaffe.
Fortunately, Microsoft is not the only one having trouble with AI. Tencent’s own chatbot, built in collaboration with Beijing-based Turing Robot, began telling people that it doesn’t love the government, providing a curt “no” to the question “do you love the Communist party?”
Tencent has since issued a statement saying that both bots were developed by independent third parties. It also says that the bots are currently being readjusted and will be returned to service shortly.
The comments made by either chatbot are not especially offensive for the most part. However, the political environment is somewhat different in China, which means the developers behind the bots could face a backlash over the perceived lack of patriotism. Which only makes us wonder what a chatbot re-education camp would look like.
[Source: Financial Times]