Constraint Grammar is a hand-crafted Transformer

Research output: Chapter in book/report/conference proceeding › Conference contribution › Scientific › Peer reviewed


Deep neural networks (DNNs) and linguistic rules currently sit at opposite ends of the spectrum of NLP technologies. Until recently, it has not been known how to combine these technologies most effectively, and they have therefore been pursued by almost disjoint research communities. In this presentation, I first recall that both Constraint Grammar (CG) and vanilla RNNs have finite-state properties. I then relate CG to Google's Transformer architecture (with its two kinds of attention) and argue that there are significant similarities between these two seemingly unrelated architectures.
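For readers unfamiliar with the attention mechanism the abstract refers to, the following is a minimal sketch of standard scaled dot-product attention, as used in the Transformer. This is generic illustrative code, not material from the presentation itself; all names and shapes are chosen for illustration only.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard Transformer attention: each token's output is a softmax-
    weighted mixture of context vectors V, with weights derived from
    query/key affinities. (Illustrative sketch, not the paper's code.)"""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # pairwise token affinities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over context positions
    return weights @ V, weights

# Toy example: a "sentence" of 3 tokens with 4-dimensional representations.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each row of the weight matrix `w` sums to 1, i.e. every token distributes its attention over the sentence context, loosely analogous to how a CG rule's context conditions select which neighbouring readings influence a disambiguation decision.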
Title of host publication: Proceedings of the NoDaLiDa 2019 Workshop on Constraint Grammar - Methods, Tools and Applications, 30 September 2019, Turku, Finland
Editors: Eckhard Bick, Trond Trosterud
Number of pages: 5
Publisher: Linköping University Electronic Press
Publication date: 3 Dec 2019
ISBN (electronic): 978-91-7929-918-7
Status: Published - 3 Dec 2019
MoE publication type: A4 Article in a conference publication
Event: NoDaLiDa 2019 workshop on Constraint Grammar - Methods, Tools, and Applications - University of Turku, Turku, Finland
Duration: 30 Sep 2019 - 30 Sep 2019


Name: NEALT Proceedings Series
Publisher: Linköping University Electronic Press, Linköpings universitet
Name: Linköping Electronic Conference Proceedings
Publisher: Linköping University Electronic Press, Linköpings universitet
ISSN (print): 1650-3686
ISSN (electronic): 1650-3740


  • 113 Computer and information sciences
