  1. Tay (chatbot) - Wikipedia

  2. Microsoft shuts down AI chatbot after it turned into a Nazi - CBS News

  3. Why Microsoft's 'Tay' AI bot went wrong - TechRepublic

  4. Here are some of the tweets that got Microsoft’s AI Tay in trouble

  5. Tay: Microsoft issues apology over racist chatbot fiasco

  6. In 2016, Microsoft’s Racist Chatbot Revealed the Dangers of …

  7. Twitter taught Microsoft’s AI chatbot to be a racist asshole in …

  8. Learning from Tay’s introduction - The Official Microsoft Blog

  9. Microsoft Created a Twitter Bot to Learn From Users. It Quickly …

  10. Microsoft and the learnings from its failed Tay artificial ... - ZDNET
