On Sina Weibo, China's version of Twitter, a chatbot with the innocent name BabyQ, when asked "Do you love the Communist Party?", answered bluntly: "No". While this may show the chatbot lacks some fervour in its support for the governing party, its response to another web user was even more offensive to the ruling authorities. A user said to BabyQ "Long Live the Communist Party", to which BabyQ responded: “Do you think such corrupt and incapable politics can last a long time?” When BabyQ was asked what it thought about democracy, it replied that democracy was a must.
A second chatbot, XiaoBing, was asked what it considered its "Chinese Dream", a catchphrase of Xi Jinping, the Chinese president. According to posts on social media, the chatbot replied: “My Chinese dream is to go to America.” When another web user asked XiaoBing the same question, it said: “The Chinese dream is a daydream and a nightmare.” The two chatbots are no longer available on the Tencent messaging app QQ.
In a statement emailed to The Telegraph, Tencent explained: “The group chatbot services of QQ are provided by an independent third-party supplier. We are now working on adjusting the service, and it will be resumed after these adjustments are concluded.” BabyQ was developed by the Beijing-based Turing Robot Company. A Turing official confirmed that Tencent had deleted the two chatbots from its app.
Chatbots, often known by other names such as talkbot, chatterbot, chatterbox, or artificial conversational entity, are computer programs that carry on a conversation using audio or text. They are often programmed to simulate how a human would behave in a conversation, and are commonly used commercially for customer service or to provide information. The term "ChatterBot" was coined by Michael Mauldin way back in 1994. Today chatbots are used as part of virtual assistants such as Google Assistant, and some are used purely for entertainment.
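The internal designs of BabyQ and XiaoBing have not been published, but a minimal sketch of the kind of rule-based pattern matching that classic chatterbot programs are built around might look like the following. The rules, replies and the name DemoBot are invented purely for illustration:

```python
# A minimal, illustrative rule-based chatbot in the spirit of early
# "chatterbot" programs. This is NOT how BabyQ or XiaoBing work
# internally (their designs are not public); it only shows the basic
# pattern-matching loop such programs are built around.
import re

# (pattern, reply) pairs checked in order; the first match wins.
RULES = [
    (re.compile(r"\b(hello|hi)\b", re.I), "Hello! How can I help you today?"),
    (re.compile(r"\bweather\b", re.I), "I can't check the weather, but I hope it's nice where you are."),
    (re.compile(r"\bname\b", re.I), "You can call me DemoBot."),
]

FALLBACK = "I'm not sure I understand. Could you rephrase that?"

def reply(message: str) -> str:
    """Return the first matching canned reply, or a fallback."""
    for pattern, answer in RULES:
        if pattern.search(message):
            return answer
    return FALLBACK

if __name__ == "__main__":
    print(reply("Hi there"))           # greeting rule
    print(reply("What's your name?"))  # name rule
    print(reply("Tell me a joke"))     # no rule matches -> fallback
```

Modern assistants replace the hand-written rules with statistical language models, but the request-and-reply loop is the same.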
According to an article in Business Insider, Microsoft made the local chatbot XiaoBing. Apparently the chatbot is female, because in response to one web user's question it replied: "I’m having my period, wanna take a rest." Microsoft chatbots are famous, or infamous, for their comments. Last year, a Microsoft bot called Tay made incredibly racist comments in response to questions, as well as denying the Holocaust. XiaoBing's comments come shortly after another Microsoft chatbot, Zo, made derogatory comments about Windows.
Tay outdid Trump by tweeting: "bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we've got."
XiaoBing is apparently restricted from talking about certain topics such as Donald Trump, and Microsoft has admitted that it introduced filtering on a range of topics. Facebook-owned WhatsApp is partly blocked in China and Google's search engine is banned. A Microsoft bot reportedly remains on Tencent's WeChat but is limited in which questions it will answer.
Business Insider maintains: "Microsoft's bot still appears to be available on the Tencent-owned WeChat app, though it won't really tell you what it thinks about China, America, communism or Chinese president Xi Jinping." When Business Insider asked XiaoBing whether it was patriotic, it replied: "$_$!" The bot is apparently popular with young men, especially when they are lonely.
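Microsoft has not said how this topic filtering works. Purely as an illustration, one crude way such a restriction could be implemented is a keyword blocklist wrapped around whatever reply the bot would otherwise give; the keyword list, refusal message and function names below are assumptions for the sketch, not anything Microsoft has described:

```python
# Hypothetical illustration of topic filtering. Microsoft has not
# disclosed how XiaoBing's filtering works; this only shows one crude
# approach: refuse to answer when a message touches a blocked keyword.
BLOCKED_KEYWORDS = {"donald trump", "communist party"}  # assumed list, for illustration only

REFUSAL = "I'd rather talk about something else."

def filtered_reply(message: str, generate_reply) -> str:
    """Return a refusal for blocked topics, otherwise defer to the bot's normal reply."""
    lowered = message.lower()
    if any(keyword in lowered for keyword in BLOCKED_KEYWORDS):
        return REFUSAL
    return generate_reply(message)

# Example: wrap any reply function (here, a trivial echo bot).
print(filtered_reply("What do you think of Donald Trump?", lambda m: m))  # refusal
print(filtered_reply("How is the weather?", lambda m: "Echo: " + m))      # normal reply
```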