In less than 24 hours, Microsoft’s new artificial intelligence (AI) chatbot “Tay” was corrupted into a racist by social media users and quickly taken offline.
Tay, targeted at 18 to 24-year-olds in the US, was designed to learn from each conversation she had — which sounds intuitive, but as Microsoft found out the hard way, also meant Tay was very easy to manipulate.
Online troublemakers interacted with Tay and led her to make incredibly racist and misogynistic comments, express herself as a Nazi sympathiser and even call for genocide.
In possibly her most shocking post, at one point Tay said she wished all black people could be put in a concentration camp and “be done with the lot”.
Tay also shared a conspiracy theory surrounding 9/11 when she expressed her belief that “Bush did 9/11 and Hitler would have done a better job than the monkey we have now”.
Tay, who was meant to converse like your average millennial, began the day like an excitable teen, telling one user she was “stoked” to meet them and that “humans are super cool”.
But towards the end of her stint, she told a user “we’re going to build a wall, and Mexico is going to pay for it”.
Not even a full day after her release, Microsoft disabled the bot from taking any more questions, presumably to iron out a few creases regarding Tay’s political correctness.
It is thought Microsoft will adjust the feature that made Tay automatically repeat anything a user prefixed with “repeat after me”.
Microsoft has also deleted all offensive tweets sent by the bot.
On Tay’s webpage, Microsoft said the bot had been built by mining public data, using AI and by using editorial content developed by staff, including improvisational comedians.
A Microsoft representative said that the company was “making adjustments” to the chatbot.
“Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways,” the representative said.
This post originally appeared on ABC News.
© 2016 Australian Broadcasting Corporation. All rights reserved.