A Comparison of Context-sensitive Models for Lexical Substitution

Aina Garí Soler, Anne Cocos, Marianna Apidianaki, Chris Callison-Burch

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review


Word embedding representations provide good estimates of word meaning and give state-of-the-art performance in semantic tasks. Embedding approaches differ as to whether and how they account for the context surrounding a word. We present a comparison of different word and context representations on the task of proposing substitutes for a target word in context (lexical substitution). We also experiment with tuning contextualized word embeddings on a dataset of sense-specific instances for each target word. We show that powerful contextualized word representations, which give high performance in several semantics-related tasks, deal less well with the subtle in-context similarity relationships needed for substitution. This is better handled by models trained with this objective in mind, where the inter-dependence between word and context representations is explicitly modeled during training.
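The core of the substitution setup described in the abstract is ranking candidate substitutes by their similarity to the target word's in-context representation. A minimal sketch of this ranking step, using hypothetical toy vectors in place of real contextualized embeddings (in practice these would come from a contextualized encoder such as ELMo or BERT):

```python
import numpy as np

# Hypothetical contextualized vectors (toy values, not from any real model).
# target_in_context stands for e.g. "bright" in "a bright student".
target_in_context = np.array([0.9, 0.1, 0.3])
candidates = {
    "smart":  np.array([0.8, 0.2, 0.4]),
    "shiny":  np.array([0.1, 0.9, 0.2]),
    "clever": np.array([0.7, 0.1, 0.5]),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Rank candidate substitutes by similarity to the target's in-context vector.
ranking = sorted(candidates,
                 key=lambda w: cosine(target_in_context, candidates[w]),
                 reverse=True)
print(ranking)  # semantically close substitutes should rank first
```

The paper's comparison concerns how `target_in_context` is built (word-only vs. context-aware vs. jointly trained representations); the ranking step itself stays the same across models.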
Original language: English
Title of host publication: Proceedings of the 13th International Conference on Computational Semantics - Long Papers: 23-27 May, 2019, Gothenburg, Sweden
Number of pages: 12
Place of publication: Stroudsburg, PA
Publisher: The Association for Computational Linguistics
Publication date: 1 May 2019
ISBN (Electronic): 978-1-950737-19-2
Publication status: Published - 1 May 2019
Externally published: Yes
MoE publication type: A4 Article in conference proceedings
Event: International Conference on Computational Semantics (IWCS 2019) - University of Gothenburg, Gothenburg, Sweden
Duration: 23 May 2019 - 27 May 2019
Conference number: 13

Fields of Science

  • 6121 Languages
