Depth-Wise Attention (DWAtt): A Layer Fusion Method for Data-Efficient Classification

Muhammad ElNokrashy, Badr AlKhamissi, Mona Diab


Abstract
Language Models pretrained on large textual data have been shown to encode different types of knowledge simultaneously. Traditionally, only the features from the last layer are used when adapting to new tasks or data. We put forward that, when using or finetuning deep pretrained models, intermediate layer features that may be relevant to the downstream task are buried too deep to be used efficiently in terms of the samples or steps needed. To test this, we propose a new layer fusion method, Depth-Wise Attention (DWAtt), to help re-surface signals from non-final layers. We compare DWAtt to a basic concatenation-based layer fusion method (Concat), and compare both to a deeper model baseline, with all models kept within a similar parameter budget. Our findings show that DWAtt and Concat are more step- and sample-efficient than the baseline, especially in the few-shot setting. DWAtt outperforms Concat on larger data sizes. On CoNLL-03 NER, layer fusion yields a 3.68–9.73% F1 gain across few-shot sizes. The layer fusion models presented significantly outperform the baseline in various training scenarios with different data sizes, architectures, and training constraints.
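
To make the abstract's idea concrete, below is a minimal, hypothetical PyTorch sketch of depth-wise attention layer fusion: each token attends over its own hidden states from every encoder layer, so signals from intermediate layers can be re-surfaced. The class name, shapes, and parameterization are illustrative assumptions, not the authors' released implementation.

# Hypothetical sketch (not the paper's exact formulation): per-token
# attention over the depth axis of a pretrained encoder's layer stack.
import torch
import torch.nn as nn

class DWAttFusion(nn.Module):
    def __init__(self, d_model: int):
        super().__init__()
        # Assumption: one query/key/value projection shared across layers.
        self.query = nn.Linear(d_model, d_model)
        self.key = nn.Linear(d_model, d_model)
        self.value = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def forward(self, layer_states: torch.Tensor) -> torch.Tensor:
        # layer_states: (batch, seq, n_layers, d_model), the stacked
        # hidden states of all encoder layers for each token.
        last = layer_states[:, :, -1, :]          # final layer: (B, T, D)
        q = self.query(last).unsqueeze(2)         # (B, T, 1, D)
        k = self.key(layer_states)                # (B, T, L, D)
        v = self.value(layer_states)              # (B, T, L, D)
        scores = (q * k).sum(-1) * self.scale     # attention over depth: (B, T, L)
        w = scores.softmax(dim=-1).unsqueeze(-1)  # (B, T, L, 1)
        fused = (w * v).sum(dim=2)                # mixture of all layers: (B, T, D)
        return fused + last                       # residual on the final layer

# Toy usage: a 12-layer encoder with hidden size 768.
states = torch.randn(2, 16, 12, 768)
out = DWAttFusion(768)(states)                    # (2, 16, 768), fed to a task head

Under the same assumptions, the Concat baseline mentioned in the abstract would instead flatten the depth axis, concatenating all n_layers states per token and projecting back to d_model with a single linear layer.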
Anthology ID: 2024.lrec-main.417
Volume: Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month: May
Year: 2024
Address: Torino, Italia
Editors: Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues: LREC | COLING
Publisher: ELRA and ICCL
Pages: 4665–4674
URL: https://aclanthology.org/2024.lrec-main.417
Cite (ACL): Muhammad ElNokrashy, Badr AlKhamissi, and Mona Diab. 2024. Depth-Wise Attention (DWAtt): A Layer Fusion Method for Data-Efficient Classification. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 4665–4674, Torino, Italia. ELRA and ICCL.
Cite (Informal): Depth-Wise Attention (DWAtt): A Layer Fusion Method for Data-Efficient Classification (ElNokrashy et al., LREC-COLING 2024)
PDF: https://aclanthology.org/2024.lrec-main.417.pdf