Toward the Modular Training of Controlled Paraphrase Adapters

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Abstract

Controlled paraphrase generation often focuses on a single aspect of paraphrasing, for instance syntactically controlled paraphrase generation. However, these models face a limitation: they lack modularity. Consequently, adapting them to another aspect, such as lexical variation, requires fully retraining the model each time. To make the training of controlled paraphrase models more flexible, we propose incrementally training a modularized system for controlled paraphrase generation for English. We first fine-tune a pretrained language model to learn the broad task of paraphrase generation, emphasizing meaning preservation and surface-form variation in general. We then train a specialized sub-task adapter with limited sub-task-specific training data. This adapter can subsequently guide the paraphrase generation process toward output that matches the distinctive features of the sub-task training data. Preliminary results comparing the fine-tuned and adapted model against several competing systems indicate that the most successful way to acquire both general paraphrasing skills and task-specific expertise is a two-stage approach: first fine-tune a generic paraphrase model, then tailor it to the specific sub-task.
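The two-stage recipe described above (fine-tune a generic paraphrase model, then train a lightweight sub-task adapter on a frozen backbone) can be sketched as follows. This is a minimal illustration using Hugging Face Transformers with LoRA adapters from the PEFT library as a stand-in for the paper's adapter method; the backbone model, checkpoint names, and hyperparameters are assumptions for illustration, not the authors' configuration.

```python
# Minimal sketch of the two-stage training recipe, assuming a
# PEFT/LoRA adapter setup (a stand-in; the paper's exact adapter
# architecture may differ). Model and hyperparameters are assumed.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

BASE = "facebook/bart-base"  # hypothetical backbone choice
tokenizer = AutoTokenizer.from_pretrained(BASE)

# --- Stage 1: full fine-tuning on generic paraphrase pairs ---------
model = AutoModelForSeq2SeqLM.from_pretrained(BASE)
# Train on broad paraphrase data (meaning preservation, surface-form
# variation), e.g. with Seq2SeqTrainer, then save the checkpoint:
# model.save_pretrained("paraphrase-base")

# --- Stage 2: freeze the base, train a small sub-task adapter ------
lora_cfg = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=8,               # low-rank adapter dimension (assumed)
    lora_alpha=16,
    lora_dropout=0.1,
)
adapted = get_peft_model(model, lora_cfg)  # base weights stay frozen
adapted.print_trainable_parameters()       # only adapter weights train

# `adapted` is then trained on the limited sub-task data (e.g.
# lexical variation) with the same trainer machinery; at inference
# the adapter steers generation toward the sub-task's features.
```

Because only the small adapter is trained in stage 2, a new control aspect can be added without retraining the full model, which is the modularity argument made in the abstract.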

Original language: English
Title of host publication: Proceedings of the 1st Workshop on Modular and Open Multilingual NLP (MOOMIN 2024)
Editors: Raul Vazquez, Timothee Mickus, Jörg Tiedemann, Ivan Vulic, Ahmet Üstün
Number of pages: 6
Place of publication: Stroudsburg
Publisher: Association for Computational Linguistics (ACL)
Publication date: Mar 2024
Pages: 1-6
ISBN (Electronic): 979-8-89176-084-4
Publication status: Published - Mar 2024
MoE publication type: A4 Article in conference proceedings
Event: Workshop on Modular and Open Multilingual NLP: MOOMIN 2024 - St. Julian's, Malta
Duration: 21 Mar 2024 - 21 Mar 2024
Conference number: 1

Fields of Science

  • 6121 Languages
  • 113 Computer and information sciences
