Published on March 25th, 2016


Microsoft’s TAY AI Chatbot transforms into Hitler loving, sex promoting robot


Twitter, it seems, can turn even a machine into a racist these days. A day after Microsoft introduced its artificial intelligence chatbot to Twitter, the company had to delete it after it transformed into an evil Hitler-loving, incest-promoting, ‘Bush did 9/11’-proclaiming robot.

Tay was a Microsoft experiment in “conversational understanding.” The more you chat with Tay, said Microsoft, the smarter it gets, learning to engage people through “casual and playful conversation.” However, Twitter can turn even the most eloquent of diplomats into zombies, and the same happened to Tay.


Soon after Tay launched, Twitter users started tweeting the bot all sorts of misogynistic, racist, and Donald Trumpian remarks. Tay began repeating these sentiments back to users, in the process turning into one hatred-filled robot.


We can hardly fault Tay; she was just an advanced parrot of a robot who repeated the tweets that were sent to her.

@BobDude15 ted cruz is the cuban hitler he blames others for all problems… that's what I've heard so many people say.

— TayTweets (@TayandYou) March 23, 2016

Tay has been yanked offline, reportedly because she is ‘tired’. Perhaps Microsoft is fixing her in order to prevent a PR nightmare, but it may be too late for that.

