Latest Development in the FoTran Project – Scaling Up Language Coverage in Neural Machine Translation Using Distributed Training with Language-Specific Components

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Abstract

We give an update on the Found in Translation (FoTran) project, focusing on the study of emerging language-agnostic representations from neural machine translation (NMT). We describe our attention-bridge model, a modular NMT model which connects language-specific components through a shared network layer. Our latest implementation supports distributed training over many nodes and GPUs in order to substantially scale up the number of languages that can be included in a modern neural translation architecture.
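As a rough illustration of the modular design described in the abstract, the sketch below shows how language-specific encoders and decoders might be connected through a single shared attention layer that maps variable-length encoder states onto a fixed number of "bridge" vectors. This is a minimal PyTorch sketch under our own assumptions (learned attention queries, tiny Transformer components, illustrative module names and sizes, no causal masking); it is not the project's actual implementation.

# Sketch: language-specific encoders/decoders joined by one shared attention bridge.
# All names and hyperparameters are illustrative assumptions, not the FoTran code.
import torch
import torch.nn as nn


class AttentionBridge(nn.Module):
    """Shared layer: compresses any encoder output into k fixed 'bridge' vectors."""

    def __init__(self, d_model: int, k: int = 8):
        super().__init__()
        self.queries = nn.Parameter(torch.randn(k, d_model))  # k learned query vectors
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)

    def forward(self, enc_states: torch.Tensor) -> torch.Tensor:
        # enc_states: (batch, src_len, d_model) -> bridge: (batch, k, d_model)
        q = self.queries.unsqueeze(0).expand(enc_states.size(0), -1, -1)
        bridge, _ = self.attn(q, enc_states, enc_states)
        return bridge


class ModularNMT(nn.Module):
    """Language-specific components around a single shared attention bridge."""

    def __init__(self, vocab_sizes: dict, d_model: int = 256):
        super().__init__()

        def make_encoder():
            layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            return nn.TransformerEncoder(layer, num_layers=2)

        def make_decoder():
            layer = nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True)
            return nn.TransformerDecoder(layer, num_layers=2)

        self.embeddings = nn.ModuleDict(
            {lang: nn.Embedding(v, d_model) for lang, v in vocab_sizes.items()}
        )
        self.encoders = nn.ModuleDict({lang: make_encoder() for lang in vocab_sizes})
        self.decoders = nn.ModuleDict({lang: make_decoder() for lang in vocab_sizes})
        self.generators = nn.ModuleDict(
            {lang: nn.Linear(d_model, v) for lang, v in vocab_sizes.items()}
        )
        # The bridge is the only component shared by every language pair.
        self.bridge = AttentionBridge(d_model)

    def forward(self, src, tgt, src_lang: str, tgt_lang: str):
        enc = self.encoders[src_lang](self.embeddings[src_lang](src))
        bridge = self.bridge(enc)  # fixed-size, language-agnostic representation
        dec = self.decoders[tgt_lang](self.embeddings[tgt_lang](tgt), bridge)
        return self.generators[tgt_lang](dec)


if __name__ == "__main__":
    model = ModularNMT({"en": 1000, "fi": 1200})
    src = torch.randint(0, 1000, (2, 7))      # toy English source batch
    tgt = torch.randint(0, 1200, (2, 5))      # toy Finnish target batch
    print(model(src, tgt, "en", "fi").shape)  # torch.Size([2, 5, 1200])

Because only the bridge is shared, each language-specific encoder or decoder can in principle be placed on a different GPU or node, which is the kind of modularity the distributed-training setup mentioned in the abstract relies on.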
Original language: English
Title of host publication: Proceedings of the 23rd Annual Conference of the European Association for Machine Translation
Editors: Helena Moniz, Lieve Macken, Andrew Rufener, et al.
Number of pages: 2
Place of publication: Geneva
Publisher: European Association for Machine Translation
Publication date: 2022
Pages: 311-312
ISBN (Electronic): 9789464597622
Publication status: Published - 2022
MoE publication type: A4 Article in conference proceedings
Event: Annual Conference of the European Association for Machine Translation - Ghent, Belgium
Duration: 1 Jun 2022 – 3 Jun 2022
Conference number: 23
https://eamt2022.com

Fields of Science

  • 6121 Languages
  • 113 Computer and information sciences
