
Microsoft Shut Down Its Twitter AI Bot After It Became A Flaming Racist, Hitler-Loving Maniac


Artificial Intelligence will kill us all, and Microsoft’s latest AI Twitter experiment, TayTweets, is proof that we can’t trust the robots. E-V-E-R.

The company launched “Tay” as its “AI fam from the internet that’s got zero chill” on Wednesday, and very quickly things went from cute and fun to sinister and blatantly racist.

https://twitter.com/TayandYou/status/712613527782076417

The tech was created to “experiment with and conduct research on conversational understanding”, specifically targeted at 18- to 24-year-olds in the US.

https://twitter.com/TayandYou/status/712810627828297728

However, the teams behind the project failed to factor in the twitterverse’s love of hijacking brands and, well, the whole trolling thing… c’mon guys!? How did you miss that?

The nature of TayTweets means the bot picks up on users’ statements, language and comments. It’s designed to learn from language patterns and then recycle those attitudes across other conversations.

So when people started bringing up Donald Trump, the Nazis and 9/11… things got out of hand VERY quickly.

Microsoft naturally panicked and began pulling the pin, but the damage had already been done. Tay was a flaming racist with an evil agenda, and soon it was able to bring almost every conversation back to genocide. After all, Microsoft’s new creation was a big fan of everyday conversation about the Holocaust :S

https://twitter.com/DetInspector/status/712833936364277760?ref_src=twsrc%5Etfw

https://twitter.com/TayandYou/status/712856578567839745

Here are 15 of the most ridiculously outrageous things Microsoft’s ‘Tay’ AI bot said to users this week that were completely NOT P.C.

15 Totally Racist Comments From Microsoft’s TayTweets AI Bot

