NutFrame: Frame-based Conceptual Structure Induction with LLMs

Shaoru Guo, Yubo Chen, Kang Liu, Ru Li, Jun Zhao


Abstract
Conceptual structure is fundamental to human cognition and natural language understanding, so it is important to explore whether Large Language Models (LLMs) understand such knowledge. Since FrameNet serves as a well-defined conceptual-structure knowledge resource, with meaningful frames, fine-grained frame elements, and rich frame relations, we construct a benchmark for coNceptual structure induction based on FrameNet, called NutFrame. It contains three sub-tasks: Frame Induction, Frame Element Induction, and Frame Relation Induction. We design prompts to induce the conceptual structure of FrameNet with LLMs and conduct extensive experiments on NutFrame to evaluate various widely used LLMs. Experimental results demonstrate that FrameNet induction remains a challenge for LLMs.
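
The paper's actual prompts and models are not reproduced on this page; the following is a minimal sketch, assuming a hypothetical query_llm helper, of how the three NutFrame sub-tasks could be posed to an LLM as prompts.

```python
# Illustrative sketch only: `query_llm` is a hypothetical stand-in for any
# chat-style LLM API; the prompts below are not the paper's actual prompts.

def query_llm(prompt: str) -> str:
    """Placeholder: send `prompt` to an LLM and return its text response."""
    raise NotImplementedError("plug in your preferred LLM client here")

def frame_induction(sentence: str, target: str) -> str:
    # Sub-task 1: induce the frame evoked by a target word in context.
    prompt = (
        f"Sentence: {sentence}\n"
        f"Target word: {target}\n"
        "Which FrameNet frame does the target word evoke? Answer with the frame name."
    )
    return query_llm(prompt)

def frame_element_induction(sentence: str, frame: str) -> str:
    # Sub-task 2: induce the frame elements (semantic roles) realized for a frame.
    prompt = (
        f"Sentence: {sentence}\n"
        f"Frame: {frame}\n"
        "List the frame elements of this frame that are realized in the sentence."
    )
    return query_llm(prompt)

def frame_relation_induction(frame_a: str, frame_b: str) -> str:
    # Sub-task 3: induce the relation holding between two frames
    # (e.g. Inheritance, Using, Subframe).
    prompt = (
        f"Frame A: {frame_a}\n"
        f"Frame B: {frame_b}\n"
        "What FrameNet relation holds between Frame A and Frame B?"
    )
    return query_llm(prompt)
```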
Anthology ID: 2024.lrec-main.1079
Volume: Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month: May
Year: 2024
Address: Torino, Italia
Editors: Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues: LREC | COLING
Publisher: ELRA and ICCL
Pages: 12330–12335
URL: https://aclanthology.org/2024.lrec-main.1079
Cite (ACL): Shaoru Guo, Yubo Chen, Kang Liu, Ru Li, and Jun Zhao. 2024. NutFrame: Frame-based Conceptual Structure Induction with LLMs. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 12330–12335, Torino, Italia. ELRA and ICCL.
Cite (Informal): NutFrame: Frame-based Conceptual Structure Induction with LLMs (Guo et al., LREC-COLING 2024)
PDF: https://aclanthology.org/2024.lrec-main.1079.pdf