Barend Beekhuizen


2024

Using Language Models to Unravel Semantic Development in Children’s Use of Perception Verbs
Bram van Dijk | Max J. van Duijn | Li Kloostra | Marco Spruit | Barend Beekhuizen
Proceedings of the Workshop on Cognitive Aspects of the Lexicon @ LREC-COLING 2024

In this short paper we employ a Language Model (LM) to gain insight into how the complex semantics of a Perception Verb (PV) emerge in children. Using a Dutch LM as a representation of mature language use, we find that, for all ages, 1) the LM accurately predicts PV use in children’s freely told narratives; 2) children’s PV use is close to mature use; and 3) complex PV meanings with attentional and cognitive aspects can be found. Our approach illustrates how LMs can be meaningfully employed in studying language development and thus takes a constructive position in the debate on the relevance of LMs in this context.
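
As an illustration of the general approach (a minimal sketch only, not the paper’s actual model, data, or evaluation: the choice of BERTje, the example utterance, and the verb are assumptions), one can ask a pretrained Dutch masked LM how strongly it expects a perception verb in a given slot of an utterance:

    # Minimal sketch: query a Dutch masked LM for the probability of the
    # perception verb "kijken" ("to look") in a masked slot. The model and
    # the example sentence are illustrative assumptions, not the paper's materials.
    import torch
    from transformers import AutoTokenizer, AutoModelForMaskedLM

    tokenizer = AutoTokenizer.from_pretrained("GroNLP/bert-base-dutch-cased")
    model = AutoModelForMaskedLM.from_pretrained("GroNLP/bert-base-dutch-cased")

    sentence = f"De jongen wil naar de vogel {tokenizer.mask_token}."
    inputs = tokenizer(sentence, return_tensors="pt")
    mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero().item()

    with torch.no_grad():
        probs = torch.softmax(model(**inputs).logits[0, mask_index], dim=-1)

    # Probability the LM assigns to the perception verb in this context.
    verb_id = tokenizer.convert_tokens_to_ids("kijken")
    print(f"P(kijken | context) = {probs[verb_id].item():.4f}")

Comparing such LM probabilities against children’s actual verb choices is one way to operationalize closeness to mature use; the paper’s precise procedure may differ.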

2023

What social attitudes about gender does BERT encode? Leveraging insights from psycholinguistics
Julia Watson | Barend Beekhuizen | Suzanne Stevenson
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

Much research has sought to evaluate the degree to which large language models reflect social biases. We complement such work with an approach to elucidating the connections between language model predictions and people’s social attitudes. We show how word preferences in a large language model reflect social attitudes about gender, using two datasets from human experiments that found differences in gendered or gender-neutral word choices by participants with differing views on gender (progressive, moderate, or conservative). We find that the language model BERT takes into account factors that shape human lexical choice of such language, but may not weigh those factors in the same way people do. Moreover, we show that BERT’s predictions most resemble responses from participants with moderate to conservative views on gender. Such findings illuminate how a language model (1) may differ from people in how it deploys words that signal gender, and (2) may prioritize some social attitudes over others.
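
For a concrete sense of how such word preferences can be probed (a hypothetical sketch: the sentence frame, the word set, and the use of bert-base-uncased are assumptions, not the paper’s stimuli or metric), one can compare the probabilities BERT assigns to gendered and gender-neutral role nouns in the same masked context:

    # Hypothetical sketch: compare BERT's masked-token probabilities for
    # gendered vs. gender-neutral role nouns in one context. The sentence and
    # word list are illustrative, not the experimental materials.
    import torch
    from transformers import AutoTokenizer, AutoModelForMaskedLM

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

    sentence = f"The committee elected a new {tokenizer.mask_token} yesterday."
    inputs = tokenizer(sentence, return_tensors="pt")
    mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero().item()

    with torch.no_grad():
        probs = torch.softmax(model(**inputs).logits[0, mask_index], dim=-1)

    for word in ["chairman", "chairwoman", "chairperson"]:
        pieces = tokenizer.tokenize(word)
        if len(pieces) != 1:
            # Only single-wordpiece candidates are directly comparable here.
            print(f"{word}: split into multiple wordpieces, skipped")
            continue
        word_id = tokenizer.convert_tokens_to_ids(pieces[0])
        print(f"P({word} | context) = {probs[word_id].item():.6f}")

Relating such model preferences to the choices of participants with differing views on gender is what allows the comparison of BERT’s predictions to particular social attitudes.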

2022

Remodelling complement coercion interpretation
Frederick Gietz | Barend Beekhuizen
Proceedings of the Society for Computation in Linguistics 2022

2021

A Formidable Ability: Detecting Adjectival Extremeness with DSMs
Farhan Samir | Barend Beekhuizen | Suzanne Stevenson
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021

2019

Say Anything: Automatic Semantic Infelicity Detection in L2 English Indefinite Pronouns
Ella Rabinovich | Julia Watson | Barend Beekhuizen | Suzanne Stevenson
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)

Computational research on error detection in second language speakers has mainly addressed clear grammatical anomalies typical of learners at the beginner-to-intermediate level. We focus instead on the acquisition of subtle semantic nuances of English indefinite pronouns by non-native speakers at varying levels of proficiency. We first lay out theoretical, linguistically motivated hypotheses, and supporting empirical evidence, on the nature of the challenges posed by indefinite pronouns to English learners. We then propose and evaluate an automatic approach for the detection of atypical usage patterns, demonstrating that deep learning architectures are promising for this task, which involves nuanced semantic anomalies.
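
One simple way to make the detection idea concrete (a minimal sketch under assumptions; the paper’s deep learning architectures, data, and decision rule are not reproduced here) is to flag an indefinite-pronoun choice as potentially infelicitous when a pretrained LM strongly prefers the alternative pronoun in the same context:

    # Minimal, hypothetical sketch: compare P(something) and P(anything) for a
    # blanked slot in a learner sentence. The model choice, example sentence,
    # and any flagging threshold are assumptions, not the paper's method.
    import torch
    from transformers import AutoTokenizer, AutoModelForMaskedLM

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

    def pronoun_preference(sentence_with_blank: str) -> dict:
        """Return masked-LM probabilities of 'something' and 'anything' for the blank."""
        text = sentence_with_blank.replace("___", tokenizer.mask_token)
        inputs = tokenizer(text, return_tensors="pt")
        mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero().item()
        with torch.no_grad():
            probs = torch.softmax(model(**inputs).logits[0, mask_index], dim=-1)
        return {w: probs[tokenizer.convert_tokens_to_ids(w)].item()
                for w in ("something", "anything")}

    # Negative-polarity context: if a learner wrote "something" here, the LM's
    # strong preference for "anything" suggests an atypical choice.
    print(pronoun_preference("I did not eat ___ for breakfast."))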

2015

Perceptual, conceptual, and frequency effects on error patterns in English color term acquisition
Barend Beekhuizen | Suzanne Stevenson
Proceedings of the Sixth Workshop on Cognitive Aspects of Computational Language Learning

2014

A Usage-Based Model of Early Grammatical Development
Barend Beekhuizen | Rens Bod | Afsaneh Fazly | Suzanne Stevenson | Arie Verhagen
Proceedings of the Fifth Workshop on Cognitive Modeling and Computational Linguistics