Is Modularity Transferable? A Case Study through the Lens of Knowledge Distillation

Mateusz Klimaszewski, Piotr Andruszkiewicz, Alexandra Birch


Abstract
The rise of Modular Deep Learning showcases its potential in various Natural Language Processing applications. Parameter-efficient fine-tuning (PEFT) modularity has been shown to work for various use cases, from domain adaptation to multilingual setups. However, all this work covers the case where the modular components are trained and deployed within a single Pre-trained Language Model (PLM). This model-specific setup is a substantial limitation on the very modularity that modular architectures strive to achieve. We ask whether current modular approaches are transferable between models and whether we can transfer modules from more robust and larger PLMs to smaller ones. In this work, we aim to fill this gap through the lens of Knowledge Distillation, commonly used for model compression, and present an extremely straightforward approach to transferring pre-trained, task-specific PEFT modules between same-family PLMs. Moreover, we propose a method that allows the transfer of modules between incompatible PLMs without any change in inference complexity. Experiments on Named Entity Recognition, Natural Language Inference, and Paraphrase Identification tasks over multiple languages and PEFT methods showcase the initial potential of transferable modularity.
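The core recipe the abstract describes, detaching a task-specific PEFT module from one backbone and re-attaching it to a same-family one, can be illustrated with a minimal sketch. This is not the authors' exact code: the model name, adapter path, and label count below are hypothetical, and the sketch assumes the source and target checkpoints share hidden dimensions so the module weights are shape-compatible.

```python
# Minimal, hypothetical sketch of re-attaching a task-specific LoRA
# module trained on one PLM to a same-family PLM. Names and paths are
# illustrative, not the authors' setup.
from transformers import AutoModelForSequenceClassification
from peft import PeftModel

# Hypothetical path to a module trained on the source (e.g., larger) model.
ADAPTER_PATH = "path/to/task_adapter"

# Target backbone from the same model family; assumes its hidden size
# matches the source model's, so the adapter weights fit directly.
student = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=3
)

# Load the pre-trained module onto the new backbone. Only the adapter
# weights are swapped in, so inference complexity is unchanged.
student = PeftModel.from_pretrained(student, ADAPTER_PATH)
student.eval()
```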
Anthology ID: 2024.lrec-main.817
Volume: Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month: May
Year: 2024
Address: Torino, Italia
Editors: Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues: LREC | COLING
Publisher: ELRA and ICCL
Pages: 9352–9360
URL: https://aclanthology.org/2024.lrec-main.817
Cite (ACL): Mateusz Klimaszewski, Piotr Andruszkiewicz, and Alexandra Birch. 2024. Is Modularity Transferable? A Case Study through the Lens of Knowledge Distillation. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 9352–9360, Torino, Italia. ELRA and ICCL.
Cite (Informal): Is Modularity Transferable? A Case Study through the Lens of Knowledge Distillation (Klimaszewski et al., LREC-COLING 2024)
PDF: https://aclanthology.org/2024.lrec-main.817.pdf