Finnish ASR with Deep Transformer Models

Abhilash Jain, Aku Rouhe, Stig-Arne Grönroos, Mikko Kurimo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Abstract

Recently, BERT and Transformer-XL based architectures have achieved strong results in a range of NLP applications. In this paper, we explore Transformer architectures, BERT and Transformer-XL, as language models for a Finnish ASR task with different rescoring schemes. With Transformer-XL we achieve strong results in both an intrinsic and an extrinsic task, obtaining 29% better perplexity and 3% better WER than our previous best LSTM-based approach. We also introduce a novel three-pass decoding scheme which improves the ASR performance by 8%. To the best of our knowledge, this is also the first work (i) to formulate an alpha smoothing framework to use the non-autoregressive BERT language model for an ASR task, and (ii) to explore sub-word units with Transformer-XL for an agglutinative language like Finnish.
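The abstract refers to rescoring ASR hypotheses with a Transformer language model and to an alpha smoothing framework. As a rough illustration of the general idea only (not the paper's exact formulation), the sketch below re-ranks an n-best list by log-linearly interpolating the first-pass LM score with a score from a stronger LM under a weight alpha. All names (Hypothesis, lm_log_prob, alpha, beta) and the exact interpolation form are assumptions for illustration.

```python
# Minimal sketch of n-best rescoring with an external language model.
# The class and function names below are illustrative assumptions,
# not the formulation used in the paper.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Hypothesis:
    text: str              # candidate transcription from the first pass
    am_score: float        # acoustic model log-probability
    first_lm_score: float  # log-probability from the first-pass LM (e.g. n-gram)


def rescore_nbest(
    hypotheses: List[Hypothesis],
    lm_log_prob: Callable[[str], float],  # e.g. Transformer-XL sentence log-prob
    alpha: float = 0.5,                   # interpolation weight for the new LM
    beta: float = 0.0,                    # optional per-token length bonus
) -> List[Hypothesis]:
    """Re-rank an n-best list by combining acoustic and language model scores."""
    def combined(h: Hypothesis) -> float:
        new_lm = lm_log_prob(h.text)
        # Log-linear interpolation of the first-pass LM and the rescoring LM,
        # added to the acoustic score with a word-count bonus.
        lm_score = (1.0 - alpha) * h.first_lm_score + alpha * new_lm
        return h.am_score + lm_score + beta * len(h.text.split())

    return sorted(hypotheses, key=combined, reverse=True)
```

In practice the weight alpha (and any length bonus) would be tuned on a development set; the paper's three-pass scheme and BERT-specific scoring go beyond this simple re-ranking.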
Original language: English
Title of host publication: Proceedings of Interspeech 2020
Number of pages: 5
Place of publication: Baixas
Publisher: ISCA - International Speech Communication Association
Publication date: 2020
Pages: 3630-3634
DOIs
Publication status: Published - 2020
Externally published: Yes
MoE publication type: A4 Article in conference proceedings
Event: Interspeech 2020 - [Virtual conference]
Duration: 25 Oct 2020 - 29 Oct 2020
http://www.interspeech2020.org

Publication series

Name: Interspeech
Publisher: ISCA
ISSN (Electronic): 2308-457X

Fields of Science

  • 6121 Languages
  • 113 Computer and information sciences
