Unified Language Model Pre-training for Natural Language Understanding and Generation
Author(s): Edward Ma
Using UNILM to tackle natural language understanding (NLU) and natural language generation (NLG)
Recent state-of-the-art pre-trained NLP models use language modeling to learn contextualized text representations. From ELMo (Peters et al., 2018) and GPT (Radford et al., 2018) to BERT (Devlin et al., 2018), all of them use a language model (LM) to achieve better results.
Dong et al. present a new model, the Unified Language Model (UNILM), to tackle both natural language understanding (NLU) and natural language generation (NLG); it is trained on English Wikipedia and BookCorpus. Different from ELMo (Peters et al., 2018), GPT (Radford et al., 2018), and BERT (Devlin et al., 2018), UNILM is jointly pre-trained on three language modeling objectives: unidirectional, bidirectional, and sequence-to-sequence prediction. A single shared Transformer network handles all three objectives, with different self-attention masks controlling how much context each token can attend to.
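The masking idea is easiest to see in the masks themselves. Below is a minimal sketch in PyTorch (not the authors' released code) of how the three self-attention masks could be built; the function names are illustrative assumptions, and a True entry means the row token may attend to the column token:

```python
import torch

def bidirectional_mask(n: int) -> torch.Tensor:
    """Bidirectional LM (BERT-style): every token attends to every token."""
    return torch.ones(n, n).bool()

def left_to_right_mask(n: int) -> torch.Tensor:
    """Unidirectional LM (GPT-style): token i attends only to tokens 0..i."""
    return torch.tril(torch.ones(n, n)).bool()

def seq2seq_mask(src_len: int, tgt_len: int) -> torch.Tensor:
    """Sequence-to-sequence LM: source tokens attend bidirectionally within
    the source; target tokens attend to the whole source plus the
    already-generated prefix of the target."""
    n = src_len + tgt_len
    mask = torch.zeros(n, n)
    mask[:, :src_len] = 1  # every position may attend to the source segment
    mask[src_len:, src_len:] = torch.tril(torch.ones(tgt_len, tgt_len))
    return mask.bool()     # source rows never attend to target columns

if __name__ == "__main__":
    # For a 3-token source and 2-token target, the target rows see the full
    # source plus a causal prefix of the target.
    print(seq2seq_mask(3, 2).int())
```

Because all three objectives share the same Transformer parameters, the fine-tuned model can serve NLU tasks through the bidirectional mask and NLG tasks through the sequence-to-sequence mask.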