Microsoft has announced its deep learning language model Turing-NLG, which at 17 billion parameters is the largest language model to date. While it is not publicly available, a demo version has been released to a small group for testing purposes. Let’s see whether more parameters mean better results by comparing it with OpenAI’s GPT-2 and NVIDIA’s Megatron-LM.
ALBERT was developed by a group of research scientists at Google Research as an “upgrade to BERT.” The NLP model is designed to improve both the performance and the efficiency of natural language processing tasks, and it has now been made publicly available. Let’s take a closer look.
We interviewed ML Conference speaker Christoph Henkelmann in Berlin. The natural language processing expert shared some insights on Google’s model BERT, OpenAI’s now fully released model GPT-2, and what the future may hold for NLP.