INVITED TALKS

There will be three invited talks, to be given by leading scientists who will present work and thoughts on different aspects of natural language.

The invited talks will take place during the main conference.

The invited talks are:

From Structure to Meaning: Simple Sentence-Structure Cues Guide Sentence Comprehension by Young Children

Cynthia Fisher (Department of Psychology and the Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, U.S.A.)

Economics about Language

Ariel Rubinstein (School of Economics, Tel Aviv University, Israel and Department of Economics, Princeton University, U.S.A.)

Layout in NLP: The Case for Document Structure

Donia Scott (Information Technology Research Institute (ITRI) at the University of Brighton, U.K.)

From Structure to Meaning: Simple Sentence-Structure Cues Guide Sentence Comprehension by Young Children

Cynthia Fisher (Department of Psychology and the Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, U.S.A.)
Bio: Cynthia Fisher received her Ph.D. in Psychology from the University of Pennsylvania in 1989. She is an associate professor in the UIUC Department of Psychology and a part-time faculty member in the Beckman Institute Cognitive Science Group. Her fields of professional interest are language acquisition, speech perception, and language comprehension.

Theories of language acquisition have traditionally assumed that children learn to identify syntactic structures in their native language through independent access to word and sentence meanings. If sentence meanings can be derived from world observation, then children begin with a set of form-meaning pairs that should help them to figure out the grammar. But even in the most concrete cases, sentences do not merely label events in some simple and universal way. Instead, they take a perspective on them, focusing on or highlighting different aspects of the events. How, then, is the child to determine a speaker's meaning before learning the language? In this talk I will argue that (a) the syntactic structure of sentences in which a verb is used can provide hints about its meaning, and (b) simple aspects of sentence structures influence sentence interpretation even for very young children. These simple sentence structure cues, including the set of familiar nouns in a sentence and the order in which they occur, bias children toward correct interpretations of sentences even before they know much (if anything) about the syntax of their native language. In this way, the interpretation of sentences can be structure-sensitive nearly from the start, and verb meanings are acquired as a consequence of this process of sentence interpretation, rather than being a prerequisite to it.

Economics about Language

Ariel Rubinstein (School of Economics, Tel Aviv University, Israel and Department of Economics, Princeton University, U.S.A.)
Bio: Ariel Rubinstein is Professor at the School of Economics at Tel Aviv University and at the Department of Economics at Princeton University. His main research fields are Game Theory and the foundations of Economic Theory. Ariel studied Mathematics and Economics at the Hebrew University and received his Ph.D. in 1979. He taught at the Hebrew University from 1981 until he moved to Tel Aviv in 1990. He has spent a term or more visiting Nuffield College (Oxford), Bell Laboratories, MSRI Berkeley, the London School of Economics, the universities of Chicago, Pennsylvania, Columbia and NYU, and the Russell Sage Foundation. Among the named lectures he has delivered are the Walras-Bowley Lecture (1988), the CORE Lectures (1995), the Churchill Lectures (Cambridge, 1996), the Pareto Lecture (Alicante, 1996), the Zeuthen Lecture (Copenhagen, 1996), the Schwartz Lecture (Northwestern, 1998) and the Schumpeter Lecture (Bolzano, 2000). Ariel has been a fellow of the Econometric Society since 1985 and is currently the society's vice-president. He is a foreign honorary member of the American Academy of Arts and Sciences, a foreign honorary member of the American Economic Association, a fellow of the Israeli Academy of Sciences, and was awarded the Israel Prize in Economics in 2002.

I will try to demonstrate what we economists can say about linguistics by presenting two short investigations in which we use economic reasoning to address linguistic issues. The first discussion will be an attempt to derive properties of binary relations from considerations of functionality. The second discussion will introduce strategic considerations to explain pragmatic phenomena in debates.

Reading: Ariel Rubinstein, Economics and Language, Cambridge University Press, 2000.

Layout in NLP: The Case for Document Structure

Donia Scott (Information Technology Research Institute (ITRI) at the University of Brighton, U.K.)

Bio: Professor Donia Scott has been head of the Information Technology Research Institute at the University of Brighton (UK) since 1991. During this period she has built a research group specializing in several areas of computational linguistics, especially natural language generation (NLG), lexical representation, and corpus linguistics. Her own research has focused on multilingual NLG, and on the realization of rhetorical relationships through layout, punctuation, and discourse connectives. Earlier in her career Professor Scott worked for some years on speech and intonation, at Sussex University and Philips Research Laboratories.

This talk will present the case for abstract document structure as a separate descriptive level in the analysis and generation of written texts. The purpose of this representation is to mediate between the message of a text (i.e., its discourse structure) and its physical presentation (i.e., its organization into graphical constituents like sections, paragraphs, sentences, bulleted lists, figures, footnotes and so forth). Abstract document structure can be seen as an extension of Nunberg's 'text-grammar'; it is also closely related to 'logical' mark-up in languages like HTML and LaTeX. We will argue that by using this intermediate representation, several subtasks in language generation and language understanding can be defined more cleanly.