“BERT is a system that can be tuned to do practically all tasks in NLP”
We interviewed ML Conference speaker Christoph Henkelmann in Berlin. The natural language processing expert shared some insights on Google’s model BERT, on OpenAI’s GPT-2, whose full model was recently released, and on what the future may hold for NLP.
What sets Google’s natural language processing (NLP) model BERT apart from other language models? How can a custom version be implemented, and what is the so-called ImageNet moment?
ML Conference speaker Christoph Henkelmann (DIVISIO) answered our questions regarding these topics and shared his opinion on OpenAI’s controversial choice to initially withhold the full-parameter model of GPT-2.
BERT is a system that can be tuned to do practically all tasks in NLP. It’s very versatile but also really powerful.