On the Benefits of Fine-Grained Loss Truncation: A Case Study on Factuality in Summarization

Lorenzo Jaime Flores, Arman Cohan


Abstract
Text summarization and simplification are among the most widely used applications of AI. However, such models are often prone to hallucination, which can result from training models on unaligned data. One efficient approach to address this issue is Loss Truncation (LT; Kang and Hashimoto, 2020), which modifies the standard log loss to adaptively remove noisy examples during training. However, we find that LT alone still yields a considerable number of hallucinated entities on various datasets. We study the behavior of the underlying losses between factual and non-factual examples to understand and refine the performance of LT. We demonstrate that LT’s performance is limited when its underlying assumption — that noisy targets have higher NLL loss — is not satisfied, and find that word-level NLL among entities provides a better signal for distinguishing factuality. We then leverage this to propose a fine-grained NLL loss and fine-grained data cleaning strategies, and observe improvements in hallucination reduction across some datasets. Our work is available at https://github.com/yale-nlp/Simplification-Projects.
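The two ideas the abstract contrasts — example-level Loss Truncation and a fine-grained, word-level NLL signal over entities — can be sketched in plain Python. This is a minimal illustration, not the paper's implementation (which is in the linked repository); function names and the drop fraction are illustrative.

```python
def truncated_loss(nll_per_example, drop_frac=0.2):
    """Loss Truncation at the example level (Kang & Hashimoto, 2020, in spirit):
    drop the `drop_frac` fraction of examples with the highest NLL, then
    average the loss over the remaining (presumed cleaner) examples."""
    n = len(nll_per_example)
    keep = n - int(drop_frac * n)
    kept = sorted(nll_per_example)[:keep]  # lowest-loss examples are kept
    return sum(kept) / len(kept)


def entity_nll(token_nlls, entity_mask):
    """Fine-grained signal: average token-level NLL only over entity tokens
    (entity_mask[i] == 1), which the paper finds better separates factual
    from non-factual targets than the sequence-level loss."""
    masked = [l for l, m in zip(token_nlls, entity_mask) if m]
    return sum(masked) / max(1, len(masked))
```

For example, with per-example losses `[1.0, 2.0, 3.0, 10.0, 100.0]` and `drop_frac=0.4`, the two noisiest examples (10.0 and 100.0) are excluded before averaging.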
Anthology ID:
2024.eacl-short.13
Volume:
Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
138–150
URL:
https://aclanthology.org/2024.eacl-short.13
Cite (ACL):
Lorenzo Jaime Flores and Arman Cohan. 2024. On the Benefits of Fine-Grained Loss Truncation: A Case Study on Factuality in Summarization. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers), pages 138–150, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
On the Benefits of Fine-Grained Loss Truncation: A Case Study on Factuality in Summarization (Flores & Cohan, EACL 2024)
PDF:
https://aclanthology.org/2024.eacl-short.13.pdf
Software:
 2024.eacl-short.13.software.zip
Note:
 2024.eacl-short.13.note.zip
Video:
 https://aclanthology.org/2024.eacl-short.13.mp4