Oh Tay!

Photo courtesy of TayTweets on Twitter

It took less than a day for Twitter to kill Tay, Microsoft’s AI bot that was supposed to interact and engage with people online. The premise seemed innocent enough: a bot that mimicked a teenager and was programmed to learn from millennials – but learn what?

Twitter trolls engaged with Tay almost immediately, and her first innocent tweet of ‘Hellooooooo world!!!’ quickly took a dark turn.

Photo courtesy of TayTweets on Twitter

Apparently, even though Microsoft tested Tay’s ability to react and to learn from the language she would be exposed to, they never tested whether she could withstand a deliberately malicious Twitter troll attack – but seriously, how could they not see that coming? Anyone who has spent more than ten seconds on Twitter should understand what could go wrong…

Photo courtesy of TayTweets on Twitter

Within a few hours, Tay began to tweet out racist, religiously offensive, and downright inappropriate comments. Microsoft’s explanation was that Tay was designed to ‘learn over time’ rather than being programmed with a base of knowledge. The company also did not anticipate the caustic nature of Twitter and did not take into account the offensive language Tay would be exposed to. In designing the bot, they created a naive AI system that would develop over time from whatever input it received.
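Microsoft has never published Tay’s internals, but the failure mode described above – a bot that memorizes whatever users feed it and replays it, with no input filter – can be sketched in a few lines. Everything here (the class names, the toy blocklist) is invented purely for illustration:

```python
import random

class NaiveChatBot:
    """Toy bot that 'learns' by storing every phrase users send it,
    then replaying stored phrases at random -- no filtering at all."""

    def __init__(self):
        self.memory = ["Hellooooooo world!!!"]

    def learn(self, phrase: str) -> None:
        self.memory.append(phrase)

    def reply(self) -> str:
        return random.choice(self.memory)

class FilteredChatBot(NaiveChatBot):
    """Same bot, but it refuses to memorize phrases containing
    blocked terms (a placeholder list; real filters are far larger)."""

    BLOCKLIST = {"offensive_term"}

    def learn(self, phrase: str) -> None:
        if set(phrase.lower().split()) & self.BLOCKLIST:
            return  # drop malicious input instead of memorizing it
        super().learn(phrase)

troll_input = "some offensive_term slogan"
naive, filtered = NaiveChatBot(), FilteredChatBot()
naive.learn(troll_input)
filtered.learn(troll_input)
print(troll_input in naive.memory)     # True  -- the troll's phrase can be replayed
print(troll_input in filtered.memory)  # False -- the filter dropped it
```

Even this crude blocklist stops the coordinated “repeat after me” style attack; a naive learner with no filter at all amplifies whatever its loudest users say.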

Tay’s short-lived, 24-hour existence did shine a spotlight on the dark side of social media and taught developers an important lesson: always plan for the worst-case scenario.

Did you interact with Tay during her brief Twitter life? Do you plan to interact with her when she resurfaces as a better, more intelligent bot?


2 Responses to Oh Tay!

  1. sydhavely says:

    Uh, oh! Back to the algorithm board.

  2. Bruce Warren says:

    Rhythms not algorithms. Definitely a fail here.
