Translate to Disambiguate: Zero-shot Multilingual Word Sense Disambiguation with Pretrained Language Models

Haoqiang Kang, Terra Blevins, Luke Zettlemoyer


Abstract
Pretrained Language Models (PLMs) learn rich cross-lingual knowledge and perform well on diverse tasks such as translation and multilingual word sense disambiguation (WSD) when finetuned. However, they often struggle to disambiguate word senses in a zero-shot setting. To better understand this contrast, we present a new study investigating how well PLMs capture cross-lingual word sense knowledge with Contextual Word-Level Translation (C-WLT), an extension of word-level translation (WLT) that prompts the model to translate a given word in context. We find that as model size increases, PLMs encode more cross-lingual word sense knowledge and make better use of context to improve WLT performance. Building on C-WLT, we introduce a zero-shot prompting approach for WSD, tested on 18 languages from the XL-WSD dataset. Our method outperforms fully supervised baselines on recall for many evaluation languages without additional training or finetuning. This study presents a first step towards understanding how to best leverage the cross-lingual knowledge inside PLMs for robust zero-shot reasoning in any language.
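To make the abstract's description concrete, the sketch below shows one way a C-WLT-style prompt and a translation-to-sense lookup could be wired together in Python. The prompt wording, the toy sense inventory, and the helper names are illustrative assumptions for exposition; they are not the paper's exact templates or evaluation setup.

# Illustrative sketch of contextual word-level translation (C-WLT) prompting
# for zero-shot WSD. Prompt template, sense keys, and helpers are hypothetical.

def build_cwlt_prompt(context: str, word: str, target_lang: str) -> str:
    """Ask a PLM to translate `word` as it is used in `context`."""
    return (
        f'In the sentence "{context}", '
        f'the word "{word}" is translated into {target_lang} as "'
    )

def disambiguate(generated_translation: str, sense_inventory: dict) -> list:
    """Return the senses whose known target-language translations match the
    model's generated translation (e.g., translations drawn from a
    multilingual sense inventory such as a wordnet)."""
    prediction = generated_translation.strip().lower()
    return [
        sense
        for sense, translations in sense_inventory.items()
        if prediction in {t.lower() for t in translations}
    ]

if __name__ == "__main__":
    # Toy example: "bank" in a financial context, with Spanish as the pivot.
    prompt = build_cwlt_prompt(
        context="She deposited the check at the bank.",
        word="bank",
        target_lang="Spanish",
    )
    print(prompt)  # A PLM would be asked to complete this string.

    # Suppose the model completes the prompt with "banco".
    senses = {
        "bank%financial_institution": {"banco"},
        "bank%river_edge": {"orilla", "ribera"},
    }
    print(disambiguate("banco", senses))  # -> ['bank%financial_institution']

In this sketch, the sense with a matching translation is selected as the prediction; the paper evaluates this general translate-then-map idea at scale on XL-WSD.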
Anthology ID: 2024.eacl-long.94
Volume: Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: March
Year: 2024
Address: St. Julian’s, Malta
Editors: Yvette Graham, Matthew Purver
Venue: EACL
Publisher: Association for Computational Linguistics
Pages: 1562–1575
URL: https://aclanthology.org/2024.eacl-long.94
Cite (ACL): Haoqiang Kang, Terra Blevins, and Luke Zettlemoyer. 2024. Translate to Disambiguate: Zero-shot Multilingual Word Sense Disambiguation with Pretrained Language Models. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1562–1575, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal): Translate to Disambiguate: Zero-shot Multilingual Word Sense Disambiguation with Pretrained Language Models (Kang et al., EACL 2024)
PDF: https://aclanthology.org/2024.eacl-long.94.pdf
Video: https://aclanthology.org/2024.eacl-long.94.mp4