Université Paris-Diderot – Bâtiment Olympe de Gouges – Salle 626
Roger Levy (MIT)
Surprisal, memory constraints, and the noisy channel in human sentence processing
Human language comprehension poses some of the deepest scientific challenges in accounting for the capabilities of the human mind. In this lecture I describe several major advances we have recently made in this domain that have led to a state-of-the-art theory of language comprehension. First, I describe a detailed expectation-based theory of real-time language understanding, surprisal, that unifies three topics central to the field — ambiguity resolution, prediction, and syntactic complexity — and that finds broad empirical support. I also cover work on memory constraints that appear to influence patterns of processing difficulty in sentence comprehension, independently of surprisal. Finally, I describe a “noisy-channel” theory which generalizes the expectation-based theory by removing the assumption of modularity between the processes of individual word recognition and sentence-level comprehension. This theory accounts for critical outstanding puzzles for previous approaches, and helps move us toward a theoretical integration of surprisal and memory.
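For readers unfamiliar with the term, the surprisal of a word is standardly defined as the negative log probability of that word given its preceding context; under surprisal theory, a word's processing difficulty is predicted to scale with this quantity. A minimal illustration (the probability values are invented for the example, not taken from the talk):

```python
import math

def surprisal(p: float) -> float:
    """Surprisal of an event with probability p, in bits: -log2(p).

    Under surprisal theory, a word's processing difficulty scales
    with -log P(word | context).
    """
    return -math.log2(p)

# A strongly expected continuation (hypothetical p = 0.5) carries
# little surprisal; an unexpected one (hypothetical p = 0.0625) carries more.
print(surprisal(0.5))     # 1.0 bit
print(surprisal(0.0625))  # 4.0 bits
```

Lower-probability (less predictable) words thus incur higher surprisal, which is the sense in which the theory is "expectation-based."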