Within hours, Tay was making sympathetic references to Hitler, criticizing the sitting American president, and endorsing Trump as the man to secure America's future. Microsoft said it was "making adjustments" to Tay, but gave no word on when the bot might return. By Thursday afternoon, most of the messages on its Twitter account had been deleted. For the moment, the program was simply not prepared to handle controversial topics. Caroline Sinders, an expert in "conversational analytics" who works on chatbots for another tech company, called Tay "an example of bad design."