Abstract
Controlled paraphrase generation often focuses on a specific aspect of paraphrasing, for instance, syntactically controlled paraphrase generation. However, these models face a limitation: they lack modularity. Consequently, adapting them to another aspect, such as lexical variation, requires fully retraining the model each time. To make the training of controlled paraphrase models more flexible, we propose incrementally training a modularized system for controlled paraphrase generation for English. We start by fine-tuning a pretrained language model to learn the broad task of paraphrase generation, generally emphasizing meaning preservation and surface-form variation. Subsequently, we train a specialized sub-task adapter with limited sub-task-specific training data. This adapter can then guide the paraphrase generation process toward outputs that exhibit the distinctive features of the sub-task training data. Preliminary results comparing the fine-tuned and adapted model against various competing systems indicate that the most successful way to acquire both general paraphrasing skills and task-specific expertise is a two-stage approach: first fine-tune a generic paraphrase model, then tailor it to the specific sub-task.
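As a rough illustration of the two-stage recipe described in the abstract, the sketch below first fine-tunes a small pretrained seq2seq model on generic paraphrase pairs and then trains a lightweight adapter on a handful of sub-task examples while the base model stays frozen. It uses Hugging Face Transformers with a LoRA adapter from the `peft` library as one possible adapter implementation; the model name (`t5-small`), the toy data, and the hyperparameters are illustrative assumptions, not the paper's actual setup.

```python
# Two-stage sketch: (1) full fine-tuning on generic paraphrase pairs,
# (2) training only a LoRA adapter on a small sub-task set (base model frozen).
# Model, data, and hyperparameters are placeholders, not the paper's setup.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")


def train_step(m, src, tgt, optimizer):
    """One gradient step on a batch of (source, target) paraphrase pairs."""
    batch = tokenizer(src, return_tensors="pt", padding=True, truncation=True)
    labels = tokenizer(tgt, return_tensors="pt", padding=True, truncation=True).input_ids
    loss = m(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()


# Stage 1: generic paraphrase fine-tuning (toy pair stands in for a large corpus).
model.train()
generic_pairs = [("paraphrase: the cat sat on the mat", "a cat was sitting on the mat")]
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
for src, tgt in generic_pairs:
    train_step(model, [src], [tgt], opt)

# Stage 2: attach a LoRA adapter and train only its parameters on a small
# sub-task set (e.g. lexically divergent paraphrases); get_peft_model freezes
# the base model, so only the adapter weights receive gradients.
adapter_cfg = LoraConfig(task_type=TaskType.SEQ_2_SEQ_LM, r=8, lora_alpha=16, lora_dropout=0.1)
adapted = get_peft_model(model, adapter_cfg)
adapted.train()
subtask_pairs = [("paraphrase: the movie was great", "the film was wonderful")]
opt = torch.optim.AdamW((p for p in adapted.parameters() if p.requires_grad), lr=1e-3)
for src, tgt in subtask_pairs:
    train_step(adapted, [src], [tgt], opt)
```

The appeal of this setup, mirroring the modularity argument above, is that the stage-1 model is trained once, while each new sub-task only adds a small adapter on top of it.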
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the 1st Workshop on Modular and Open Multilingual NLP (MOOMIN 2024) |
| Editors | Raul Vazquez, Timothee Mickus, Jörg Tiedemann, Ivan Vulic, Ahmet Üstün |
| Number of pages | 6 |
| Place of publication | Stroudsburg |
| Publisher | Association for Computational Linguistics (ACL) |
| Publication date | Mar 2024 |
| Pages | 1-6 |
| ISBN (electronic) | 979-8-89176-084-4 |
| Publication status | Published - Mar 2024 |
| MoE publication type | A4 Article in conference proceedings |
| Event | Workshop on Modular and Open Multilingual NLP (MOOMIN 2024), St. Julian's, Malta; duration: 21 Mar 2024 → 21 Mar 2024; conference number: 1 |
Fields of Science
- 6121 Languages
- 113 Computer and information sciences