
Reviews in the Neurosciences


Multisensory emotions: perception, combination and underlying neural processes

Martin Klasen1, 2 / Yu-Han Chen3 / Klaus Mathiak1, 2, 4

1Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen University, Pauwelsstrasse 30, 52074 Aachen, Germany

2JARA-Translational Brain Medicine, 52074 Aachen/52425 Jülich, Germany

3Center for Psychiatric Research, University of New Mexico, 1101 Yale Blvd NE, Albuquerque, NM 87106, USA

4Institute of Neuroscience and Medicine (INM-1), Forschungszentrum Jülich GmbH, 52425 Jülich, Germany

Corresponding author: Martin Klasen, Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen University, Pauwelsstrasse 30, 52074 Aachen, Germany

Citation Information: Reviews in the Neurosciences. Volume 23, Issue 4, Pages 381–392, ISSN (Online) 2191-0200, ISSN (Print) 0334-1763, DOI: 10.1515/revneuro-2012-0040, June 2012

Publication History

Received: 2012-03-15
Accepted: 2012-05-02
Published Online: 2012-06-20

Abstract

In our everyday lives, we perceive emotional information via multiple sensory channels. This is particularly evident for emotional faces and voices in a social context. In recent years, a multitude of studies have addressed the question of how affective cues conveyed by the auditory and visual channels are integrated. Behavioral studies show that hearing and seeing emotional expressions can support and influence each other, a notion that is supported by investigations of the underlying neurobiology. Numerous electrophysiological and neuroimaging studies have identified brain regions subserving the integration of multimodal emotions and have provided new insights into the neural processing steps underlying the synergistic confluence of affective information from voice and face. In this paper, we provide a comprehensive review covering current behavioral, electrophysiological, and functional neuroimaging findings on the combination of emotions from the auditory and visual domains. Behavioral advantages arising from multimodal redundancy are paralleled by specific integration patterns on the neural level, from encoding in early sensory cortices to late cognitive evaluation in higher association areas. In summary, these findings indicate that bimodal emotions interact at multiple stages of the audiovisual integration process.

Keywords: amygdala; EEG; emotion; fMRI; MEG; multisensory integration
