BERT models for Danish, Swedish and Norwegian have been released by the Danish company BotXO. We spoke to Jens Dahl Møllerhøj, Lead Data Scientist at BotXO, to find out more. See how these open source models differ from Google’s multilingual BERT model, what makes creating NLP models for Nordic languages difficult, and where these models can be used.
The most recent addition to Facebook AI’s open source projects is Blender, a state-of-the-art chatbot. What sets it apart from other chatbots is its novel blending of skills, including empathy and personality. Let’s take a closer look.
Microsoft has announced its deep learning language model Turing-NLG, whose impressive 17 billion parameters make it the largest language model to date. While it is not publicly available, a demo version has been released to a small group for testing purposes. Let’s see whether more parameters mean better results by comparing it to OpenAI’s GPT-2 and NVIDIA’s Megatron-LM.
ALBERT was developed by a group of research scientists at Google Research as an “upgrade to BERT.” The model is designed to improve both the performance and the efficiency of natural language processing tasks, and it has now been made publicly available. Let’s take a closer look.
We interviewed ML Conference speaker Christoph Henkelmann in Berlin. The natural language processing expert shared insights on Google’s model BERT, OpenAI’s recently fully released model GPT-2, and what the future may hold for NLP.