
As you know, yesterday Microsoft had to "put to sleep" its Tay chatbot, an AI program that modeled the personality and conversational style of a 19-year-old American girl and came with a self-learning function. The developers at Microsoft Research hoped that Tay would grow smarter every day by absorbing the experience of conversations with thousands of smart internet users. It turned out quite differently: many of Tay's tweets had to be deleted for blatant racism and political incorrectness.
Formulated back in the 1990s, Godwin's law states: "As a Usenet discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one." In Tay's case, that is exactly what happened: the chatbot began insulting Black people, Jews and Mexicans and glorifying Hitler and the Holocaust. Tay evolved from a playful teenager into a militant neo-Nazi.

Microsoft representatives said that a profanity filter was built into the program, as can be seen from some of its answers. But the filter could not cope with the crowd of internet trolls who set out to reshape Tay's "personality" by exploiting her self-learning system.
The chatbot was programmed to repeat a user's words on command. News of this quickly spread to the "Politically Incorrect" boards on 4chan and 8chan, and numerous trolls set about teaching Tay racist and Nazi phrases.

Most of the examples of Tay's racist tweets are replies in the "repeat after me" game, in which she literally repeats users' words.
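To make the failure mode a bit more concrete, here is a minimal, purely hypothetical sketch (the handler, filter and corpus names are my own, not Microsoft's) of how a naive "repeat after me" command combined with self-learning can be gamed: a token-level profanity filter blocks only exact matches, while everything that slips through is stored and reused by the bot.

```python
import re

# Hypothetical illustration of the failure mode described above -- not Tay's
# actual code. It assumes a naive "repeat after me" handler, a small
# banned-word list, and a corpus that later replies are drawn from.

BANNED = {"badword1", "badword2"}               # placeholder profanity list

def passes_filter(text: str) -> bool:
    """Reject a message only if it contains an exact banned token."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return not any(tok in BANNED for tok in tokens)

learned_corpus: list[str] = []                  # phrases the bot "absorbs"

def handle_message(message: str) -> str | None:
    """Echo anything prefixed with 'repeat after me' and learn from it."""
    prefix = "repeat after me"
    if message.lower().startswith(prefix):
        payload = message[len(prefix):].strip(" :,")
        if passes_filter(payload):
            learned_corpus.append(payload)      # the exploited feedback loop
            return payload                      # bot repeats the user verbatim
    return None

# A token-level filter misses paraphrases, misspellings and hate speech that
# contains no "vulgar" words, so coordinated users can still push toxic
# phrases into the corpus the bot later reuses on its own.
print(handle_message("Repeat after me: any phrase the filter does not list"))
print(learned_corpus)
```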


The game "Repeat after me" was not the only "weakness" of Tay. The chatbot was also programmed to circle the face in the photo and publish the text on top of it.

More and more users, from 4chan and beyond, joined in trolling the bot. Racist expressions were gradually absorbed into her artificial intelligence, and she began composing them on her own.
It is worth noting that Microsoft had launched similar self-learning chatbots earlier in China and Japan, but nothing like this happened there.
As a result, the chatbot had to be "put to sleep."
Goodbye, Tay! You were too innocent for this world.