
ToGoG: Subproject P9

Syntactic disambiguation by gesture

Researchers: Henning Holle, Christian Obermeier, Maren Schmidt-Kassow, and Thomas C. Gunter

 

 

Aims:
To investigate the interaction of beat gestures and syntax during language comprehension.

Our study uses event-related potentials (ERPs) to address the question of whether gestures are syntactically integrated into language. The stimuli are spoken German sentences that remain syntactically ambiguous until the sentence-final auxiliary (Peter weiß, dass die Oma die Enkel gebadet hat / haben; "Peter knows that the grandmother has bathed the grandchildren / the grandchildren have bathed the grandmother"). Previous research has shown that disambiguation towards the non-preferred object-first structure is associated with increased processing costs: the EEG shows a stronger positive-going deflection (a so-called P600) for the non-preferred object-first structures. The present study investigates whether this P600 can be reduced by attending to the beat gestures of a speaker. To this end, EEG was measured while participants watched videos of a speaker uttering syntactically ambiguous sentences. In addition, the speaker produced one of the following (the resulting design is sketched below):

1. no beat gesture;
2. a beat gesture on the first nominal phrase (e.g. Oma);
3. or a beat gesture on the second nominal phrase (e.g. Enkel).
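For illustration only, here is a minimal sketch of the resulting design, crossing the sentence-final disambiguation (subject-first hat vs. object-first haben) with the three gesture conditions. The condition labels and the Python representation are our own shorthand, not part of the study materials.

```python
from itertools import product

# Hypothetical labels for the 2 x 3 design described above:
# sentence-final auxiliary (disambiguation) x beat-gesture position.
disambiguation = ["subject-first (hat)", "object-first (haben)"]
gesture = ["no beat", "beat on NP1 (Oma)", "beat on NP2 (Enkel)"]

# Enumerate the six experimental conditions.
conditions = [
    {"structure": s, "beat": g} for s, g in product(disambiguation, gesture)
]

for i, cond in enumerate(conditions, start=1):
    print(i, cond["structure"], "|", cond["beat"])
```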

It was hypothesized that a beat gesture can facilitate the processing of the more difficult object-first structures, which should be reflected in a reduced P600. The results indicate that, overall, object-initial structures elicited a negativity in the 200–350 ms time window relative to subject-initial structures (possibly reflecting an N400 effect), followed by a positivity in the 400–600 ms time window (interpreted as a P600 effect). The P600 effect was large when no beat was produced and when the beat occurred on the first nominal phrase of the sentence. In contrast, the P600 effect was markedly reduced when the beat occurred on the second nominal phrase, indicating that comprehension of a difficult object-initial structure can be facilitated by a beat gesture on the subject of the sentence. The results thus provide evidence that gesture and speech interact during comprehension at the syntactic level, extending previous findings showing that the two domains interact at the semantic level.
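As a rough illustration of how such time-window effects could be quantified, the sketch below computes the mean amplitude difference (object-first minus subject-first) in the 400–600 ms P600 window from epoched EEG data; the same helper could be applied to the 200–350 ms window. The array shapes, sampling rate, and synthetic data are assumptions for the example, not the study's actual analysis pipeline.

```python
import numpy as np

# Assumed epoch layout: (n_trials, n_channels, n_samples), sampled at 500 Hz,
# with each epoch starting at the onset of the disambiguating auxiliary.
SFREQ = 500
T0 = 0.0  # epoch start, in seconds relative to auxiliary onset

def window_mean(epochs, tmin, tmax, sfreq=SFREQ, t0=T0):
    """Mean amplitude over trials, channels, and the [tmin, tmax) window."""
    i0 = int((tmin - t0) * sfreq)
    i1 = int((tmax - t0) * sfreq)
    return epochs[:, :, i0:i1].mean()

def p600_effect(obj_first, subj_first):
    """Object-first minus subject-first mean amplitude, 400-600 ms window."""
    return window_mean(obj_first, 0.4, 0.6) - window_mean(subj_first, 0.4, 0.6)

# Synthetic placeholder data for the six cells of the design
# (40 trials, 32 channels, 600 ms epochs).
rng = np.random.default_rng(0)
epochs = {cond: rng.normal(size=(40, 32, 300)) for cond in
          ["no_beat_obj", "no_beat_subj",
           "np1_beat_obj", "np1_beat_subj",
           "np2_beat_obj", "np2_beat_subj"]}

for beat in ["no_beat", "np1_beat", "np2_beat"]:
    print(beat, "P600 effect:",
          p600_effect(epochs[f"{beat}_obj"], epochs[f"{beat}_subj"]))
```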