2016-03-24

'a racist bigot'? You mean there are non-racist bigots too?

I mean really - http://www.pcworld.com/article/3048157/data-center-cloud/the-internet-turns-tay-microsofts-millennial-ai-chatbot-into-a-racist-bigot.html

The bot has a quirky penchant for tweeting emoji and using "millennial speak" - but that quickly turned into a rabid hatefest. The Internet soon discovered you could get Tay to repeat phrases back to you, as Business Insider first reported. Once that happened, the jig was up and another honest effort at "good vibes" PR was hijacked. The bot was taught everything from repeating hateful gamergate mantras to referring to the president with an offensive racial slur.

Hey Microsoft - how are all those SDETs and all that automated testing working out for you?
