Transfer Fine-tuning for Quality Estimation of Text Simplification

Yuki Hironaka, Tomoyuki Kajiwara, Takashi Ninomiya


Abstract
To efficiently train quality estimation of text simplification on a small-scale labeled corpus, we train sentence difficulty estimation prior to fine-tuning the pre-trained language models. Our proposed method improves the quality estimation of text simplification in the framework of transfer fine-tuning, in which pre-trained language models can improve performance on the target task through additional training on a relevant task prior to fine-tuning. Since the labeled corpus for quality estimation of text simplification is small (600 sentence pairs), an efficient training method is desired. We therefore propose a training method for pseudo quality estimation that does not require quality estimation labels. As a relevant task for quality estimation of text simplification, we train sentence difficulty estimation: a binary classification task that identifies which of two sentences is simpler, using an existing parallel corpus for text simplification. Experimental results on quality estimation of English text simplification showed that our method improved not only the quality estimation performance on simplicity, which was directly trained, but in some cases also the performance on fluency and meaning preservation.
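The two-stage recipe described above can be sketched with a toy stand-in: a shared encoder is first trained on pseudo-labeled sentence-difficulty classification (stage 1), then reused when fine-tuning a fresh head on a small quality estimation set (stage 2). This is a minimal illustration with synthetic features and a linear encoder, not the paper's actual setup, which uses a pre-trained language model and real simplification corpora; all data, dimensions, and learning rates here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a pre-trained encoder: a shared linear+tanh projection.
# (The paper uses a pre-trained language model; this is only illustrative.)
D_IN, D_HID = 16, 8
W_enc = rng.normal(0, 0.1, (D_IN, D_HID))

def encode(x, W):
    return np.tanh(x @ W)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1

# Stage 1: pseudo difficulty estimation -- binary "which sentence is simpler"
# labels can be derived from an existing simplification parallel corpus,
# so no human quality-estimation annotation is needed at this stage.
X1 = rng.normal(size=(200, D_IN))
y1 = (X1[:, 0] > 0).astype(float)   # synthetic "simpler" label
w1 = np.zeros(D_HID)                # stage-1 classification head

for _ in range(500):
    h = encode(X1, W_enc)
    p = sigmoid(h @ w1)
    # gradient of logistic loss w.r.t. head and shared encoder
    w1 -= lr * h.T @ (p - y1) / len(y1)
    dh = np.outer(p - y1, w1) * (1 - h**2)
    W_enc -= lr * X1.T @ dh / len(y1)

# Stage 2: fine-tune on the small labeled QE corpus (60 toy points here,
# echoing the 600-pair scale of the real corpus), reusing the stage-1
# encoder and attaching a fresh regression head.
X2 = rng.normal(size=(60, D_IN))
y2 = sigmoid(X2[:, 0])              # synthetic quality score in (0, 1)
w2 = np.zeros(D_HID)                # stage-2 regression head

def mse(W, w):
    return float(np.mean((encode(X2, W) @ w - y2) ** 2))

loss_before = mse(W_enc, w2)
for _ in range(500):
    h = encode(X2, W_enc)
    w2 -= lr * h.T @ (h @ w2 - y2) / len(y2)
loss_after = mse(W_enc, w2)
```

The point of the sketch is only the weight-sharing pattern: the encoder parameters trained on the cheap pseudo task initialize the expensive target task, which is the essence of transfer fine-tuning.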
Anthology ID:
2024.lrec-main.1455
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Pages:
16738–16744
URL:
https://aclanthology.org/2024.lrec-main.1455
Cite (ACL):
Yuki Hironaka, Tomoyuki Kajiwara, and Takashi Ninomiya. 2024. Transfer Fine-tuning for Quality Estimation of Text Simplification. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 16738–16744, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Transfer Fine-tuning for Quality Estimation of Text Simplification (Hironaka et al., LREC-COLING 2024)
PDF:
https://aclanthology.org/2024.lrec-main.1455.pdf