Latest Development in the FoTran Project – Scaling Up Language Coverage in Neural Machine Translation Using Distributed Training with Language-Specific Components

Research output: Chapter in book/report/conference proceeding › Conference contribution › Scientific › Peer review

Abstract

We give an update on the Found in Translation (FoTran) project, focusing on the study of emerging language-agnostic representations from neural machine translation (NMT). We describe our attention-bridge model, a modular NMT model which connects language-specific components through a shared network layer. Our latest implementation supports distributed training over many nodes and GPUs in order to substantially scale up the number of languages that can be included in a modern neural translation architecture.
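The attention bridge described above can be pictured as a shared pooling layer sitting between otherwise independent, language-specific encoders and decoders. Below is a minimal sketch of such a layer, assuming PyTorch; the module name, dimensions, and hyperparameters are illustrative assumptions for this note, not taken from the FoTran implementation.

```python
# A minimal sketch of a shared attention-bridge layer (assumed PyTorch).
# Names and sizes are illustrative, not from the FoTran codebase.
import torch
import torch.nn as nn

class AttentionBridge(nn.Module):
    """Pools variable-length states from any language-specific encoder
    into a fixed number of shared 'bridge' vectors that the
    language-specific decoders then attend over."""
    def __init__(self, hidden_dim: int, n_heads: int):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.heads = nn.Linear(hidden_dim, n_heads, bias=False)

    def forward(self, enc_states: torch.Tensor) -> torch.Tensor:
        # enc_states: (batch, src_len, hidden_dim)
        scores = self.heads(torch.tanh(self.proj(enc_states)))  # (batch, src_len, n_heads)
        attn = torch.softmax(scores, dim=1)  # attention over source positions, per head
        # (batch, n_heads, hidden_dim): fixed-size, source-length-independent output
        return attn.transpose(1, 2) @ enc_states

# Example: pool encoder states into 10 shared bridge vectors
bridge = AttentionBridge(hidden_dim=512, n_heads=10)
pooled = bridge(torch.randn(8, 37, 512))  # -> (8, 10, 512), regardless of src_len
```

Because every encoder feeds the same bridge and every decoder reads from it, the shared layer receives gradients from all translation directions, which is what encourages the language-agnostic representations the abstract refers to; in a distributed setup of the kind mentioned, the language-specific components can in principle be placed on separate GPUs while the shared layer is kept synchronized.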
Original language: English
Title of host publication: Proceedings of the 23rd Annual Conference of the European Association for Machine Translation
Editors: Helena Moniz, Lieve Macken, Andrew Rufener, et al.
Number of pages: 2
Place of publication: Geneva
Publisher: European Association for Machine Translation
Publication date: 2022
Pages: 311-312
ISBN (electronic): 9789464597622
Status: Published - 2022
MoE publication type: A4 Article in conference proceedings
Event: Annual Conference of the European Association for Machine Translation - Ghent, Belgium
Duration: 1 June 2022 - 3 June 2022
Conference number: 23
https://eamt2022.com

Fields of science

  • 6121 Languages
  • 113 Computer and information sciences
