“Twitter taught Microsoft’s AI chatbot to be a racist asshole in less than a day”
“It took less than 24 hours for Twitter to corrupt an innocent AI chatbot. Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in ‘conversational understanding.’”