Natural language generation with 17 billion parameters

Microsoft’s deep learning language model Turing-NLG outperforms GPT-2

Maika Möbus

Microsoft has announced its deep learning language model Turing-NLG, and its impressive 17 billion parameters make it the largest language model to date. While it is not publicly available, a demo version has been released to a small group for testing purposes. Let’s see if more parameters mean better results, compared to OpenAI’s GPT-2 and NVIDIA’s Megatron-LM.

Turing Natural Language Generation, Turing-NLG for short, is a new deep learning language model developed by Microsoft. It was announced on the Microsoft Research Blog.

SEE ALSO: OpenAI finally releases “dangerous” large-scale unsupervised language model GPT-2

Turing-NLG’s features

The transformer-based model Turing-NLG can not only generate text and summarize documents, but also answer questions. Microsoft views question answering as an important NLP task and states its goal is “to respond as directly, accurately, and fluently as humans can in any situation.”
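Turing-NLG itself is not public, but the autoregressive loop behind this kind of text generation can be sketched in miniature. The toy bigram table below is a stand-in for a real transformer: at each step the “model” produces a distribution over the next token and one token is sampled and appended.

```python
import random

# Toy sketch of autoregressive generation, as used by models like
# Turing-NLG. The hand-written bigram table stands in for a real
# transformer's next-token distribution.
BIGRAMS = {
    "the":     {"model": 0.6, "demo": 0.4},
    "model":   {"answers": 0.7, "writes": 0.3},
    "answers": {"questions": 1.0},
    "writes":  {"summaries": 1.0},
}

def generate(prompt: str, max_tokens: int = 3, seed: int = 0) -> str:
    random.seed(seed)
    tokens = prompt.split()
    for _ in range(max_tokens):
        dist = BIGRAMS.get(tokens[-1])
        if dist is None:  # no known continuation: stop generating
            break
        words, probs = zip(*dist.items())
        tokens.append(random.choices(words, weights=probs)[0])
    return " ".join(tokens)

print(generate("the"))
```

A real model replaces the lookup table with a learned network over its full vocabulary, but the generate-one-token-at-a-time structure is the same.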

In the release post, there’s even a summary created by the model itself:

Turing Natural Language Generation (T-NLG) is a 17 billion parameter language model by Microsoft that outperforms the state of the art on many downstream NLP tasks. We present a demo of the model, including its freeform generation, question answering, and summarization capabilities, to academics for feedback and research purposes. <|endoftext|>

Comparison with other language models

With its 17 billion parameters, T-NLG is, according to Microsoft, the largest language model ever published.

In this graphic, Microsoft compares the parameter counts of other language models, such as OpenAI’s full 1.5-billion-parameter GPT-2 model, which caused a stir last year and was released in full in November, and Google’s ground-breaking model BERT:
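The 17-billion figure can be sanity-checked with the common back-of-the-envelope rule that a transformer’s weights are roughly 12 × layers × hidden² (about 4d² for attention plus 8d² for the feed-forward block per layer). Microsoft’s announcement reports 78 layers and a hidden size of 4256 for T-NLG; treat those figures, and the approximation itself, as a rough sketch rather than an exact accounting:

```python
# Rough transformer parameter estimate: each layer contributes
# ~4*d^2 attention weights plus ~8*d^2 feed-forward weights
# (assuming the usual 4*d inner dimension), i.e. ~12*d^2 per layer.
def approx_params(n_layers: int, d_model: int) -> int:
    return 12 * n_layers * d_model ** 2

# Architecture figures reported for Turing-NLG: 78 layers, hidden size 4256.
print(f"{approx_params(78, 4256) / 1e9:.1f}B")  # lands close to the quoted 17B
```

Embeddings and biases are ignored here, which is why the estimate is approximate.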

The performance of the pretrained T-NLG model was put to the test on the standard language modeling benchmarks LAMBADA, which measures next-word prediction accuracy (higher is better), and WikiText-103, which measures perplexity (lower is better).

Here you can see how Turing-NLG outperformed both OpenAI’s full 1.5-billion-parameter GPT-2 model and NVIDIA’s Megatron-LM, which has 8.3 billion parameters:
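Why “lower is better” on WikiText-103: the benchmark reports perplexity, the exponential of the model’s average per-token cross-entropy, so a model that assigns higher probability to the true next words scores lower. A minimal illustration (the token probabilities below are made up for demonstration):

```python
import math

# Perplexity = exp(average negative log-probability per token).
# Lower perplexity means the model found the text less "surprising".
def perplexity(token_log_probs):
    avg_nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(avg_nll)

# Hypothetical log-probabilities two models assigned to the true tokens.
better = [math.log(0.5), math.log(0.4), math.log(0.6)]
worse  = [math.log(0.1), math.log(0.2), math.log(0.1)]

print(perplexity(better))  # ~2.03
print(perplexity(worse))   # ~7.94
```

LAMBADA, by contrast, simply scores whether the model predicts the correct final word of a passage, so higher accuracy is better.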

Demo release

So far, Turing-NLG has not been publicly released, and Microsoft has not announced whether this is planned. The company does, however, state that work created in Project Turing, which Turing-NLG is a part of, is “being integrated into multiple Microsoft products including Bing, Office, and Xbox.”

SEE ALSO: “BERT is a system that can be tuned to do practically all tasks in NLP”

See the Microsoft Research Blog for further information on Turing-NLG.

Maika Möbus
Maika Möbus has been an editor for Software & Support Media since January 2019. She studied Sociology at Goethe University Frankfurt and Johannes Gutenberg University Mainz.
