Constraint Grammar is a hand-crafted Transformer

Research output: Chapter in book/report/conference proceeding › Conference contribution › Scientific › Peer reviewed

Abstract

Deep neural networks (DNNs) and linguistic rules currently sit at opposite ends of the spectrum of NLP technologies. Until recently, it has not been known how to combine these technologies most effectively, and as a result they have been pursued by almost disjoint research communities. In this presentation, I first recall that both Constraint Grammar (CG) and vanilla RNNs have finite-state properties. I then relate CG to Google’s Transformer architecture (with its two kinds of attention) and argue that there are significant similarities between these two seemingly unrelated architectures.
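To make the CG side of the comparison concrete, here is a toy sketch (not taken from the paper) of a single Constraint Grammar-style disambiguation rule, roughly `REMOVE (V) IF (-1C (DET))`: a verb reading is removed when the immediately preceding token is unambiguously a determiner. The cohort representation and function name are illustrative assumptions, not the author's implementation.

```python
# Toy illustration of one CG-style REMOVE rule (not the paper's code).
# Each token is a "cohort": a surface form paired with a set of
# candidate readings (tags); rules prune readings based on context.

def apply_remove_rule(cohorts, target, context_tag, offset=-1):
    """Remove reading `target` from a cohort when the cohort at the
    relative position `offset` is unambiguously tagged `context_tag`
    (the "C" in -1C means the context must be fully disambiguated).
    A rule never deletes a cohort's last remaining reading."""
    for i, (form, readings) in enumerate(cohorts):
        j = i + offset
        if 0 <= j < len(cohorts):
            ctx_readings = cohorts[j][1]
            if (ctx_readings == {context_tag}
                    and target in readings
                    and len(readings) > 1):
                readings.discard(target)
    return cohorts

cohorts = [
    ("the", {"DET"}),       # unambiguous determiner
    ("run", {"N", "V"}),    # ambiguous: noun or verb
]
apply_remove_rule(cohorts, target="V", context_tag="DET")
print(cohorts[1][1])  # → {'N'}
```

The rule inspects a fixed relative position in the sentence, which is one intuition behind relating CG contexts to attention: each rule attends to specific surrounding tokens when deciding how to rewrite a token's readings.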
Original language: English
Title of host publication: Proceedings of the NoDaLiDa 2019 Workshop on Constraint Grammar - Methods, Tools and Applications, 30 September 2019, Turku, Finland
Editors: Eckhard Bick, Trond Trosterud
Number of pages: 5
Place of publication: Linköping
Publisher: Linköping University Electronic Press
Publication date: 3 Dec 2019
Pages: 45-49
Article number: 9
ISBN (electronic): 978-91-7929-918-7
Status: Published - 3 Dec 2019
MoE publication type: A4 Article in conference proceedings
Event: NoDaLiDa 2019 workshop on Constraint Grammar - Methods, Tools, and Applications - University of Turku, Turku, Finland
Duration: 30 Sep 2019 - 30 Sep 2019
https://visl.sdu.dk/nodalida2019.html

Publication series

Name: NEALT Proceedings Series
Publisher: Linköping University Electronic Press, Linköpings universitet
Number: 33
Name: Linköping Electronic Conference Proceedings
Publisher: Linköping University Electronic Press, Linköpings universitet
Number: 168
ISSN (print): 1650-3686
ISSN (electronic): 1650-3740

Fields of science

  • 113 Computer and information sciences
