A central question in sentence comprehension research concerns the kinds of information and mechanisms involved in resolving temporary ambiguity about the syntactic structure of a sentence. Gaze patterns in scenes during spoken sentence comprehension have provided strong evidence that visual scenes trigger rapid syntactic reanalysis. However, these patterns have also been interpreted as reflecting nonlinguistic, visual processes. Furthermore, little is known about whether linguistic and scene cues trigger similar processes of syntactic revision. To better understand how scenes influence comprehension and its time course, we recorded event-related potentials (ERPs) during the comprehension of spoken sentences that relate to depicted events. Prior electrophysiological research has observed a P600 when structural disambiguation toward a noncanonical structure occurred during reading, in the absence of scenes. We observed an ERP component with a similar latency, polarity, and distribution when depicted events disambiguated toward a noncanonical structure. These distributional similarities suggest that scenes are on a par with linguistic contexts in triggering syntactic revision. Our findings confirm the interpretation of previous eye-movement studies and highlight the benefits of combining ERP and eye-tracking measures to ascertain the neuronal processes enabled by, and the locus of attention in, visual contexts.