Microsoft's Tay chatbot


10-May-2016 20:35

At first, Tay simply repeated the inappropriate things that the trolls said to her.

But before too long, Tay had “learned” to say inappropriate things without a human goading her to do so.

In one highly publicized tweet, which has since been deleted, Tay said: "bush did 9/11 and Hitler would have done a better job than the monkey we have now."

By far the most entertaining AI news of the past week was the rise and rapid fall of Microsoft's teen-girl-imitation Twitter chatbot, Tay, whose Twitter tagline described her as "Microsoft's AI fam from the internet that's got zero chill." Basically, Tay was designed to develop her conversational skills using machine learning, most notably by analyzing and incorporating the language of tweets sent to her by human social media users.
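Tay's actual system was proprietary and far more sophisticated, but the basic idea of "incorporating the language of tweets" can be sketched in a few lines. The toy NaiveChatbot class below is invented purely for illustration, a minimal sketch assuming the naivest possible design, not Microsoft's implementation: it stores every phrase it hears and replies by resampling from memory.

```python
import random

class NaiveChatbot:
    """A toy learner that 'develops conversational skills' by storing
    every phrase it is told and replying with a random one from memory.
    Purely illustrative; this is not how Tay actually worked."""

    def __init__(self):
        # Seed phrases so the bot can say something before any input.
        self.memory = ["hellooo world!", "humans are super cool"]

    def listen(self, message: str) -> None:
        # Every incoming message is incorporated verbatim: no model of
        # meaning, no moderation, no notion of who is talking.
        self.memory.append(message)

    def reply(self) -> str:
        # A reply is just a resampled fragment of past input.
        return random.choice(self.memory)

bot = NaiveChatbot()
bot.listen("nice weather today")
bot.listen("AI is fun")
print(bot.reply())  # echoes one of the phrases it has absorbed
```

A design like this has no filter between what users say and what the bot later says, which is exactly the property the trolls went after.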

Microsoft has now taken Tay offline for "upgrades," and it is deleting some of the worst tweets — though many still remain.


Tay is simply a piece of software that is trying to learn how humans talk in conversation. It spouted garbage because racist humans on Twitter quickly spotted a vulnerability—that Tay didn't understand what it was talking about—and exploited it.
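That failure mode, reflecting statistical patterns without any comprehension, can be demonstrated concretely. The toy frequency-based learner below is again hypothetical, not Tay's code: because it samples replies from the raw counts of what it has been told, a coordinated flood of a single phrase comes to dominate its output.

```python
import random
from collections import Counter

# Hypothetical frequency-based learner (not Tay's code).
memory = ["hi there", "what's up", "love this weather"]

def listen(message):
    memory.append(message)

def reply():
    # Sampling from raw counts: whatever is said most is said back most.
    return random.choice(memory)

# A few ordinary users, then a coordinated troll flood of one phrase.
listen("good morning")
for _ in range(100):
    listen("<offensive phrase>")

# The flooded phrase now dominates the bot's output, even though the
# bot has no idea what any of its phrases mean.
counts = Counter(reply() for _ in range(1000))
print(counts.most_common(3))
```

Nothing in this loop understands language; the "learning" is pure frequency capture, which is why a determined group of users could steer it wherever they wanted.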