The chat bot had been sidelined after making racist comments. Microsoft's millennial chat bot "Tay" made a brief reappearance on Twitter this morning but still didn't make a stellar ...
Microsoft has said it is "deeply sorry for the unintended offensive and hurtful tweets" generated by its chatbot Tay, which was taken down after spouting racist and sexist remarks. The device was ...
However, as Microsoft explained, "the more you chat with Tay the smarter she gets", and users quickly manipulated it into voicing incredibly offensive comments. Tay began voicing fascist ...
Oh, Microsoft. Last week, the company pulled its Tay chatbot from Twitter after some users trained it to become a racist jackass. On Wednesday, Tay was brought back online, sending thousands of tweet ...
REDMOND, Wash. — Just a day after Microsoft's new artificial intelligence, Tay, launched on several social platforms, it was corrupted by the Internet. If you haven't heard of Tay, it's a machine ...
Zo is currently being tested in the messaging app Kik, with users who have opted to try the service saying the bot is effectively an English-language version of Microsoft's existing Chinese bot, Xiaoice.
Microsoft’s artificial intelligence strategy is to unleash more bots like Tay. Redmond doesn’t mean more nasty, racist or homophobic chatbots, but a forest of A.I. personalities for different uses — ...