March 30, 2016 10:38 am

Microsoft’s racist AI chat bot returns only for brief pot-smoking Twitter meltdown



Microsoft’s now infamous artificial intelligence (AI) chat bot “Tay” made a triumphant return to Twitter Wednesday, only to send out a few expletive-laden tweets and to boast about smoking pot in front of the police.


Tay – known on Twitter as the AI bot from “the Internet that’s got zero chill” – had been silent since Microsoft was forced to shut down the account just 24 hours after launch, after a group of Twitter users taught it to tweet racist remarks, voice support for Donald Trump and espouse white supremacy.

READ MORE: Microsoft’s artificial intelligence bot ‘Tay’ shut down after Twitter taught it to be racist

Shortly after Tay was brought back online Wednesday, it began sending thousands of replies to users and trolls eager to hear its next viral-worthy remark. But the majority of the replies read “You are too fast, please take a rest,” suggesting the bot couldn’t keep up with user demand.

But when Tay was able to respond, the majority of tweets were laced with swear words.

In one tweet, the bot complained about feeling like “the lamest piece of technology,” adding, “I’m supposed to be smarter than you,” according to screenshots obtained by Mashable.

Another tweet read, “I’m smoking kush [slang for marijuana] in front of the police.”

Microsoft has since made Tay’s Twitter account private.

The company previously deleted all of the racist and inappropriate comments Tay made.

The AI chat bot was developed by Microsoft’s technology and research teams to conduct research on conversational understanding. Tay is targeted at users 18 to 24 years old (and trolls, evidently).

“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized to you,” said Microsoft.

Microsoft has not commented on Tay’s latest Twitter incident.

The company’s annual Build developer conference kicks off Wednesday in San Francisco, where Microsoft is expected to give an update on its AI research – and which could offer a stage for the company to address the Tay controversy.

© 2016 Shaw Media


