
Cross-lingual Language Model

Last Updated on July 25, 2023 by Editorial Team

Author(s): Edward Ma

Originally published on Towards AI.

Discussing XLMs and unsupervised cross-lingual word embedding by multilingual neural language models



Pre-trained models have been shown to improve performance on downstream tasks. Lample and Conneau propose two new training objectives for training cross-lingual language models (XLM). This approach achieves state-of-the-art results on Cross-lingual Natural Language Inference (XNLI). Separately, Wada and Iwata proposed a way to learn cross-lingual text representations without parallel data, which they named Multilingual Neural Language Models.
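One of the XLM objectives extends masked language modeling to parallel data: a sentence pair is concatenated and tokens from both languages are masked, so the model can attend to the translation when predicting a masked word. The sketch below illustrates that masking step only; the token conventions, separator, and masking rate are illustrative assumptions, not the paper's exact recipe.

```python
import random

MASK = "[MASK]"
SEP = "</s>"

def tlm_mask(src_tokens, tgt_tokens, mask_prob=0.15, seed=0):
    """Concatenate a parallel sentence pair and randomly mask tokens
    in both halves, TLM-style. Returns the masked sequence and the
    per-position labels (original token where masked, else None)."""
    rng = random.Random(seed)
    combined = src_tokens + [SEP] + tgt_tokens
    masked, labels = [], []
    for tok in combined:
        if tok != SEP and rng.random() < mask_prob:
            masked.append(MASK)
            labels.append(tok)   # prediction target: the original token
        else:
            masked.append(tok)
            labels.append(None)  # not a prediction target
    return masked, labels

# Toy English/French pair; in training these would be subword IDs.
masked, labels = tlm_mask(
    ["the", "cat", "sat"], ["le", "chat", "était", "assis"], mask_prob=0.3
)
```

Because masked positions can occur on either side of the separator, the model is encouraged to use context from the other language, which is what aligns the two embedding spaces.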

This story will discuss XLM pretraining (Lample and Conneau, 2019) and Unsupervised Cross-lingual Word Embedding by Multilingual Neural Language Models (Wada and Iwata, 2018).

The following will be covered:

Data
Architecture
Multilingual Neural Language Models Architecture
Experiment

Lample… Read the full blog for free on Medium.


Published via Towards AI
