Microsoft has announced Turing-NLG, a deep learning language model whose impressive 17 billion parameters make it the largest language model to date. While it is not publicly available, a demo version has been released to a small group for testing purposes. Let’s see whether more parameters mean better results by comparing it with OpenAI’s GPT-2 and NVIDIA’s Megatron-LM.
The AI research organization OpenAI has announced that it is standardizing on a single deep learning framework across its projects, and the winner is PyTorch. Let’s take a closer look at not only why OpenAI selected PyTorch, but also what benefits the standardization itself should offer.
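One reason researchers often cite for preferring PyTorch is its define-by-run (eager) style, where the model runs as ordinary Python and gradients are computed on the fly. As a flavor of what that looks like, here is a minimal sketch of a single training step; the tiny model, synthetic data, and hyperparameters are illustrative assumptions, not taken from either announcement.

```python
# Minimal PyTorch sketch: one gradient step on a toy regression problem.
# Model size, data, and learning rate are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Linear(4, 1)                       # tiny single-layer model
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x = torch.randn(8, 4)                         # synthetic input batch
y = torch.randn(8, 1)                         # synthetic targets

before = loss_fn(model(x), y).item()          # loss before the update

opt.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()                               # autograd computes gradients eagerly
opt.step()                                    # apply the SGD update

after = loss_fn(model(x), y).item()           # loss after one step
```

Because the forward pass is plain Python, intermediate tensors can be inspected with a debugger or `print` at any point, which is a commonly cited advantage of this style for research work.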