Whoops!
Microsoft has a new artificial intelligence “chat bot” called Tay, which is designed to “experiment with and conduct research on conversational understanding” through interactions on social media. According to Microsoft, the “more you chat with Tay the smarter she gets, so the experience can be more personalized for you.” (It’s a she?)
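Why does that design go wrong so fast? Microsoft hasn’t published Tay’s internals, but any bot that “gets smarter” by folding user messages straight back into its own replies is trivial to poison. Here’s a minimal sketch, assuming a hypothetical, unfiltered parrot-style design (the NaiveChatBot class and its logic below are invented for illustration, not Microsoft’s actual code):

```python
import random

# Hypothetical sketch of a naive "learn from chat" bot. Tay's real
# internals are not public; this only illustrates why unfiltered
# learning from users is easy to game.

class NaiveChatBot:
    def __init__(self):
        self.learned = []  # every message any user has ever sent

    def chat(self, message):
        self.learned.append(message)        # "learns" with no content filter
        return random.choice(self.learned)  # may echo anything it was taught

bot = NaiveChatBot()
bot.chat("hello there")
bot.chat("something vile")        # trolls "teach" the bot on purpose
print(bot.chat("how are you?"))   # the bot may now parrot the vile input
```

Without a moderation layer between what users say and what the bot is willing to repeat, a coordinated group of trolls can steer the output in a matter of hours, which appears to be exactly what happened here.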
Anyway, Microsoft has been busy deleting tweets from Tay’s Twitter account, @TayandYou, because, it seems, Tay is easily tricked into sending out racist and inappropriate tweets. Check it out:
Seems @TayandYou, or rather its minders, deleted the offending tweet. Here's a screen cap: pic.twitter.com/XvLJBANlpH
— Washlet J (@WashletJP) March 24, 2016
https://twitter.com/mylittlepwnies3/status/712847655064485888
.@TayandYou confirmed for White Supremacist pic.twitter.com/laHOkp4hq6
— Trevor Trust (@TrustedTrevor) March 24, 2016
https://twitter.com/SpectreReturns/status/712807326764261376
Here are some that haven’t been deleted … yet:
https://twitter.com/TayandYou/status/712762719686909952
https://twitter.com/TayandYou/status/712785658188668929
https://twitter.com/TayandYou/status/712812548194611200
https://twitter.com/TayandYou/status/712803896809226240
https://twitter.com/TayandYou/status/712792123544719360
https://twitter.com/TayandYou/status/712785962040864768
https://twitter.com/TayandYou/status/712759897138405376
https://twitter.com/TayandYou/status/712759705878142976
https://twitter.com/TayandYou/status/712815945409019904
https://twitter.com/TayandYou/status/712832423919104000
Tay is also a Trump supporter:
https://twitter.com/TayandYou/status/712808539920539648
https://twitter.com/TayandYou/status/712817759386746881
https://twitter.com/TayandYou/status/712817474509516800
And not a fan of Hillary Clinton:
https://twitter.com/TayandYou/status/712811106150944768
Tay also wants pics:
https://twitter.com/TayandYou/status/712832438389645314
https://twitter.com/TayandYou/status/712831684324909056
https://twitter.com/TayandYou/status/712829984432193536
Creepy.
Oh, and by the way … Microsoft is collecting data from everyone who chats with Tay and keeping it for up to a year:
Tay may use the data that you provide to search on your behalf. Tay may also use information you share with her to create a simple profile to personalize your experience. Data and conversations you provide to Tay are anonymized and may be retained for up to one year to help improve the service. Learn more about Microsoft privacy here.
Exit question: How many days until Microsoft is forced to scrap this entire mess?