|Publication Type||Book chapter|
|Year of Publication||2019|
|Authors||Lücking, Andy, and Stefan Müller|
|Editor||Abeillé, Anne, Robert Borsley, and Jean-Pierre Koenig|
|Book Title||Head-Driven Phrase Structure Grammar: The handbook|
|Publisher||Language Science Press|
The received view in (psycho)linguistics, dialogue theory and gesture studies is that co-verbal gestures, i.e. hand and arm movements, are part of the utterance and contribute to its content (Kendon 1980; McNeill 1992). The relationships between gesture and speech obey regularities that need to be defined not just in terms of the relative timing of gesture to speech, but also in terms of the linguistic form of that speech: for instance, prosody and syntactic constituency and headedness (Loehr 2007; Ebert et al. 2011; Alahverdzhieva et al. 2017). Consequently, speech-gesture integration is captured in grammar by means of a gesture-grammar interface. This chapter provides basic snapshots from gesture research, reviews constraints on speech-gesture integration and summarises their implementations in HPSG frameworks. Pointers to future developments conclude the exposition. Since there are already several overviews of gesture research, such as Özyürek (2012), Wagner et al. (2014) and Abner et al. (2015), this chapter distinguishes itself by providing a guided tour of research that focuses on using (mostly) standard methods for semantic composition in constraint-based grammars like HPSG to model gesture meanings.