Dynamic Task-Oriented Dialogue: A Comparative Study of Llama-2 and Bert in Slot Value Generation

Tiziano Labruna, Sofia Brenna, Bernardo Magnini


Abstract
Recent advancements in instruction-based language models have demonstrated exceptional performance across various natural language processing tasks. We present a comprehensive analysis of the performance of two open-source language models, BERT and Llama-2, in the context of dynamic task-oriented dialogues. Focusing on the Restaurant domain and utilizing the MultiWOZ 2.4 dataset, our investigation centers on the models’ ability to generate predictions for masked slot values within text. The dynamic aspect is introduced through simulated domain changes, mirroring real-world scenarios where new slot values are incrementally added to a domain over time. This study contributes to the understanding of instruction-based models’ effectiveness in dynamic natural language understanding tasks when compared to traditional language models and emphasizes the significance of open-source, reproducible models in advancing research within the academic community.
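The abstract describes the task only at a high level. As an illustrative sketch (not the authors' exact setup), masked slot-value prediction with an encoder model such as BERT can be framed as a fill-mask query over a dialogue utterance; the checkpoint name and the example utterance below are assumptions chosen for illustration.

```python
from transformers import pipeline

# Load a BERT fill-mask pipeline. The checkpoint "bert-base-uncased" is an
# assumption; the abstract does not specify which BERT variant was used.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# A MultiWOZ-style restaurant-domain utterance with one slot value masked:
# the food-type value (e.g. "italian") is replaced by BERT's [MASK] token.
utterance = "i am looking for a [MASK] restaurant in the centre of town ."

# Rank candidate slot values by the model's predicted token probabilities.
for prediction in fill_mask(utterance, top_k=5):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```

For a decoder-only, instruction-tuned model such as Llama-2, the same query would instead be posed as a prompt asking the model to generate the missing slot value rather than fill a mask token.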
Anthology ID:
2024.eacl-srw.29
Volume:
Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Neele Falk, Sara Papi, Mike Zhang
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
358–368
URL:
https://aclanthology.org/2024.eacl-srw.29
Cite (ACL):
Tiziano Labruna, Sofia Brenna, and Bernardo Magnini. 2024. Dynamic Task-Oriented Dialogue: A Comparative Study of Llama-2 and Bert in Slot Value Generation. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop, pages 358–368, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Dynamic Task-Oriented Dialogue: A Comparative Study of Llama-2 and Bert in Slot Value Generation (Labruna et al., EACL 2024)
PDF:
https://aclanthology.org/2024.eacl-srw.29.pdf
Video:
https://aclanthology.org/2024.eacl-srw.29.mp4