MCPG: A Flexible Multi-Level Controllable Framework for Unsupervised Paraphrase Generation

Yi Chen, Haiyun Jiang, Lemao Liu, Rui Wang, Shuming Shi, Ruifeng Xu


Abstract
We present MCPG: a simple and effective approach for controllable unsupervised paraphrase generation, which is also flexible to adapt to specific domains without extra training. MCPG is controllable at different levels: local lexicons, global semantics, and universal styles. The unsupervised paradigm of MCPG combines factual keywords and diversified semantic embeddings as local lexical and global semantic constraints. The semantic embeddings are diversified by standard dropout, which we exploit for the first time to increase inference diversity. Moreover, MCPG achieves good domain adaptability by adding a transfer vector as a universal style constraint, which is refined from exemplars retrieved from the corpus of the target domain in a training-free way. Extensive experiments show that MCPG outperforms state-of-the-art unsupervised baselines by a margin. Meanwhile, our domain-adapted MCPG also achieves competitive performance with strong supervised baselines even without training.
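The dropout-based diversification described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function name, dropout rate, and sample count are assumptions; the idea is simply to apply independent inference-time dropout masks to a fixed sentence embedding to obtain diverse semantic constraints.

```python
import numpy as np

def diversify_embedding(embedding, n_samples=3, p=0.1, seed=0):
    """Apply independent dropout masks to a sentence embedding at
    inference time, yielding several diverse semantic constraints.

    Uses standard inverted dropout: each dimension is zeroed with
    probability p, and survivors are rescaled by 1/(1-p) so the
    expected magnitude of the embedding is preserved.
    """
    rng = np.random.default_rng(seed)
    variants = []
    for _ in range(n_samples):
        mask = rng.random(embedding.shape) >= p  # keep with prob. 1-p
        variants.append(embedding * mask / (1.0 - p))
    return variants

emb = np.ones(8)  # toy sentence embedding
variants = diversify_embedding(emb)
```

Each variant can then serve as the global semantic constraint for one decoding pass, so repeated generation from the same input yields lexically and structurally different paraphrases.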
Anthology ID:
2022.findings-emnlp.439
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5948–5958
URL:
https://aclanthology.org/2022.findings-emnlp.439
DOI:
10.18653/v1/2022.findings-emnlp.439
Cite (ACL):
Yi Chen, Haiyun Jiang, Lemao Liu, Rui Wang, Shuming Shi, and Ruifeng Xu. 2022. MCPG: A Flexible Multi-Level Controllable Framework for Unsupervised Paraphrase Generation. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 5948–5958, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
MCPG: A Flexible Multi-Level Controllable Framework for Unsupervised Paraphrase Generation (Chen et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.439.pdf