Multilingual NMT with a language-independent attention bridge

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › Peer review

Abstract

In this paper, we propose a multilingual encoder-decoder architecture that obtains multilingual sentence representations by incorporating an intermediate "attention bridge" shared across all languages. That is, we train the model with language-specific encoders and decoders that are connected via self-attention through a shared layer that we call the attention bridge. This layer exploits the semantics of each language to perform translation and develops into a language-independent meaning representation that can efficiently be used for transfer learning. We present a new framework for the efficient development of multilingual NMT using this model and scheduled training. We have tested the approach systematically with a multi-parallel data set. We show that the model achieves substantial improvements over strong bilingual models and that it also works well for zero-shot translation, which demonstrates its capacity for abstraction and transfer learning.
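The core idea of the bridge can be sketched as a structured self-attention pooling layer: a fixed number of attention heads read the variable-length encoder states and emit a fixed-size sentence matrix, so the same layer can sit between any encoder-decoder pair. The sketch below is a minimal NumPy illustration, not the paper's implementation; the weight names (`W1`, `W2`) and dimensions are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_bridge(H, W1, W2):
    """Structured self-attention pooling, shared across all language pairs.

    H:  (n, d)   encoder hidden states for one sentence (n tokens)
    W1: (da, d)  first projection (illustrative name)
    W2: (k, da)  k attention heads (illustrative name)
    Returns M: (k, d), a fixed-size sentence matrix independent of n.
    """
    A = softmax(W2 @ np.tanh(W1 @ H.T), axis=-1)  # (k, n) attention weights
    return A @ H                                   # (k, d) pooled representation

# Toy dimensions: the output shape does not depend on sentence length,
# which is what lets every decoder attend to the same bridge.
rng = np.random.default_rng(0)
d, da, k = 8, 6, 4
W1 = rng.normal(size=(da, d))
W2 = rng.normal(size=(k, da))
H_short = rng.normal(size=(5, d))    # 5-token sentence
H_long = rng.normal(size=(12, d))    # 12-token sentence
M_short = attention_bridge(H_short, W1, W2)
M_long = attention_bridge(H_long, W1, W2)
assert M_short.shape == M_long.shape == (k, d)
```

Because the bridge output has a fixed shape for any input length, decoders for different target languages can all be trained against the same interface, which is what the zero-shot results exploit.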
Original language: English
Title of host publication: The 4th Workshop on Representation Learning for NLP (RepL4NLP-2019): Proceedings of the Workshop
Editors: Isabelle Augenstein, Spandana Gella, Sebastian Ruder, Katharina Kann, Burcu Can, Johannes Welbl, Alexis Conneau, Xiang Ren, Marek Rei
Number of pages: 7
Place of publication: Stroudsburg
Publisher: The Association for Computational Linguistics
Publication date: 2019
Pages: 33-39
ISBN (electronic): 978-1-950737-35-2
Status: Published - 2019
MoE publication type: A4 Article in a conference publication
Event: Workshop on Representation Learning for NLP - Florence, Italy
Duration: 2 Aug 2019 - 2 Aug 2019
Conference number: 4

Fields of science

  • 113 Computer and information sciences