Abstract
Evaluation of emotional scenes requires the integration of information from different modality channels, most frequently from audition and vision. Neither the psychological nor the neural basis of auditory-visual interactions during the processing of affect is well understood. In this study, possible interactions in affective processing were investigated via event-related potential (ERP) recordings during the simultaneous presentation of affective pictures (from the IAPS) and affectively sung notes that either matched or mismatched each other in valence. To examine the role of attention in multisensory affect integration, ERPs were recorded in two different rating tasks (voice affect rating, picture affect rating) as participants evaluated the affect communicated in one modality while that in the other modality was ignored. Both the behavioral and ERP data revealed some, although non-identical, patterns of cross-modal influence; modulation of the ERP component P2 suggested a relatively early integration of affective information in the attended-picture condition, though only for happy picture-voice pairs. In addition, congruent pairing of sad pictures and sad voice stimuli affected the late positive potential (LPP). Responses in the voice affect rating task were overall more likely to be modulated by the concomitant picture's affective valence than vice versa.
Original language | English
---|---
Journal | Brain Research
Volume | 1070
Issue number | 1
Pages (from-to) | 160-170
Number of pages | 11
ISSN | 0006-8993
DOIs |
Publication status | Published - 27.01.2006
Research Areas and Centers
- Academic Focus: Center for Brain, Behavior and Metabolism (CBBM)