At the Build 2020 developer conference, Microsoft announced that it has built an Azure-hosted supercomputer designed to train OpenAI's large-scale artificial intelligence models.
The OpenAI supercomputer is powered by 285,000 CPU cores and 10,000 GPUs, with 400 gigabit per second network connectivity for each GPU server. Microsoft has invested $1 billion in OpenAI, but the total cost of the machine, described as one of the world's fastest supercomputers, is not being disclosed. To rank in the top five supercomputers, a machine would currently have to exceed roughly 23,000 teraflops (23 petaflops), and Microsoft didn't share any actual performance numbers for its computer. The expectation is that a machine of this scale will make AI models markedly more capable. Today's supercomputers are used to simulate nuclear weapons explosions, predict Earth's future climate and, more recently, search for drugs to fight the coronavirus.
Microsoft and OpenAI believe their massive computer will bring new sophistication to AI. "This type of model can so deeply absorb the nuances of language, grammar, knowledge, concepts and context that it can excel at multiple tasks: summarizing a lengthy speech, moderating content in live gaming chats, finding relevant passages across thousands of legal files or even generating code from scouring GitHub," Microsoft says. The company also said its end goal is to make its large AI models, training optimization tools, and supercomputing resources available to researchers through Azure AI services and GitHub.