A Two-Stage Framework with Self-Supervised Distillation for Cross-Domain Text Classification

Yunlong Feng, Bohan Li, Libo Qin, Xiao Xu, Wanxiang Che


Abstract
Cross-domain text classification is a crucial task because it enables models to adapt to a target domain that lacks labeled data, by leveraging rich labeled data from a different but related source domain (or domains) together with unlabeled data from the target domain. Previous work focuses on extracting either domain-invariant features or task-agnostic features, ignoring domain-aware features that may be present in the target domain and useful for the downstream task. In this paper, we propose a two-stage framework for cross-domain text classification. In the first stage, we fine-tune the model with masked language modeling (MLM) and labeled data from the source domain. In the second stage, we further fine-tune the model with self-supervised distillation (SSD) and unlabeled data from the target domain. We evaluate the framework on a public cross-domain text classification benchmark, and the experimental results show that it achieves new state-of-the-art results for both single-source domain adaptation (94.17%, +1.03%) and multi-source domain adaptation (95.09%, +1.34%).
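To make the two-stage recipe concrete, the sketch below shows one plausible instantiation (not the authors' released code). It assumes a BERT-style encoder from HuggingFace Transformers, combines the supervised source-domain loss with MLM in stage 1, and realizes SSD in stage 2 as temperature-scaled KL distillation from a frozen snapshot of the stage-1 model on unlabeled target text; the model name, masking rate, and exact SSD objective are illustrative assumptions.

# Hypothetical sketch of the two-stage framework described in the abstract.
# Assumptions (not from the paper): bert-base-uncased encoder, 15% MLM
# masking, and SSD realized as temperature-scaled KL distillation from a
# frozen copy of the stage-1 model.
import copy

import torch
import torch.nn.functional as F
from transformers import (AutoTokenizer, BertForMaskedLM,
                          BertForSequenceClassification,
                          DataCollatorForLanguageModeling)

device = "cuda" if torch.cuda.is_available() else "cpu"
tok = AutoTokenizer.from_pretrained("bert-base-uncased")

# Classification and MLM heads share one encoder, so the stage-1 MLM
# objective also adapts the encoder used by the classifier.
clf = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2).to(device)
mlm = BertForMaskedLM.from_pretrained("bert-base-uncased").to(device)
mlm.bert = clf.bert  # tie the encoders

masker = DataCollatorForLanguageModeling(tok, mlm_probability=0.15)
opt = torch.optim.AdamW(
    list(clf.parameters()) + list(mlm.cls.parameters()), lr=2e-5)

def stage1_step(texts, labels):
    """Stage 1: supervised loss on labeled source data plus MLM loss."""
    enc = tok(texts, padding=True, truncation=True,
              return_tensors="pt").to(device)
    sup = clf(**enc, labels=torch.tensor(labels, device=device)).loss
    masked = masker([tok(t, truncation=True) for t in texts])
    masked = {k: v.to(device) for k, v in masked.items()}
    loss = sup + mlm(**masked).loss
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

def stage2_step(teacher, texts, T=2.0):
    """Stage 2: self-supervised distillation on unlabeled target data.
    One plausible SSD instantiation: match the temperature-softened
    predictions of a frozen stage-1 snapshot on target-domain text."""
    enc = tok(texts, padding=True, truncation=True,
              return_tensors="pt").to(device)
    with torch.no_grad():
        t_logits = teacher(**enc).logits
    s_logits = clf(**enc).logits
    loss = F.kl_div(F.log_softmax(s_logits / T, dim=-1),
                    F.softmax(t_logits / T, dim=-1),
                    reduction="batchmean") * T * T
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# Usage: run stage 1 on labeled source batches, snapshot the model as the
# teacher, then run stage 2 on unlabeled target batches.
stage1_step(["great product", "would not recommend"], [1, 0])
teacher = copy.deepcopy(clf).eval().requires_grad_(False)
stage2_step(teacher, ["a review written in the target domain"])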
Anthology ID: 2024.lrec-main.156
Volume: Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month: May
Year: 2024
Address: Torino, Italia
Editors: Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues: LREC | COLING
Publisher: ELRA and ICCL
Pages: 1768–1777
URL: https://aclanthology.org/2024.lrec-main.156
Cite (ACL): Yunlong Feng, Bohan Li, Libo Qin, Xiao Xu, and Wanxiang Che. 2024. A Two-Stage Framework with Self-Supervised Distillation for Cross-Domain Text Classification. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 1768–1777, Torino, Italia. ELRA and ICCL.
Cite (Informal): A Two-Stage Framework with Self-Supervised Distillation for Cross-Domain Text Classification (Feng et al., LREC-COLING 2024)
PDF: https://aclanthology.org/2024.lrec-main.156.pdf