
Reviews in the Neurosciences

Editor-in-Chief: Huston, Joseph P.

Editorial Board Member: Topic, Bianca / Adeli, Hojjat / Buzsaki, Gyorgy / Crawley, Jacqueline / Crow, Tim / Eichenbaum, Howard / Gold, Paul / Holsboer, Florian / Korth, Carsten / Lubec, Gert / McEwen, Bruce / Pan, Weihong / Pletnikov, Mikhail / Robbins, Trevor / Schnitzler, Alfons / Stevens, Charles / Steward, Oswald / Trojanowski, John

8 Issues per year

IMPACT FACTOR 2015: 3.198
5-year IMPACT FACTOR: 3.546
Rank 97 out of 256 in category Neurosciences in the 2015 Thomson Reuters Journal Citation Report/Science Edition

SCImago Journal Rank (SJR) 2015: 1.605
Source Normalized Impact per Paper (SNIP) 2015: 0.912
Impact per Publication (IPP) 2015: 3.325

Volume 23, Issue 4 (Aug 2012)


Multisensory emotions: perception, combination and underlying neural processes

Martin Klasen
  • Corresponding author
  • Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen University, Pauwelsstrasse 30, 52074 Aachen, Germany
  • JARA-Translational Brain Medicine, 52074 Aachen/52425 Jülich, Germany
/ Yu-Han Chen
  • Center for Psychiatric Research, University of New Mexico, 1101 Yale Blvd NE, Albuquerque, NM 87106, USA
/ Klaus Mathiak
  • Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen University, Pauwelsstrasse 30, 52074 Aachen, Germany
  • JARA-Translational Brain Medicine, 52074 Aachen/52425 Jülich, Germany
  • Institute of Neuroscience and Medicine (INM-1), Forschungszentrum Jülich GmbH, 52425 Jülich, Germany
Published Online: 2012-06-20 | DOI: https://doi.org/10.1515/revneuro-2012-0040


In our everyday lives, we perceive emotional information via multiple sensory channels. This is particularly evident for emotional faces and voices in a social context. Over the past years, a multitude of studies have addressed the question of how affective cues conveyed by auditory and visual channels are integrated. Behavioral studies show that hearing and seeing emotional expressions can support and influence each other, a notion which is supported by investigations on the underlying neurobiology. Numerous electrophysiological and neuroimaging studies have identified brain regions subserving the integration of multimodal emotions and have provided new insights into the neural processing steps underlying the synergistic confluence of affective information from voice and face. In this paper we provide a comprehensive review covering current behavioral, electrophysiological and functional neuroimaging findings on the combination of emotions from the auditory and visual domains. Behavioral advantages arising from multimodal redundancy are paralleled by specific integration patterns on the neural level, from encoding in early sensory cortices to late cognitive evaluation in higher association areas. In summary, these findings indicate that bimodal emotions interact at multiple stages of the audiovisual integration process.

Keywords: amygdala; EEG; emotion; fMRI; MEG; multisensory integration

About the article

Martin Klasen

Martin Klasen studied psychology at the University of Trier and obtained his Ph.D. in neuroscience from RWTH Aachen University, where he is currently working as a postdoctoral fellow. He is interested in the neural basis of social behaviour and communication in a multimodal context. His current research focuses on the multisensory combination of emotional cues as well as the processing of aggression-relevant information in violent video games.

Yu-Han Chen

Yu-Han Chen obtained her Ph.D. from RWTH Aachen University. She is currently a postdoctoral fellow in the Department of Psychiatry at the University of New Mexico, studying auditory processing abnormalities in individuals with schizophrenia and bipolar disorder. Her research focuses on applying neuroimaging and electrophysiology techniques, such as magnetoencephalography (MEG), electroencephalography (EEG), and diffusion tensor imaging (DTI), to study brain structure and function.

Klaus Mathiak

Professor Klaus Mathiak, M.D., Ph.D., studied mathematics, medicine, and neuroscience. He develops methods for neuroimaging and applies them in fMRI and MEG to study complex and interactive neural processes. As a psychiatrist, he is interested in the causes of complex social disorders in psychiatric diseases and aims to develop new treatment strategies such as fMRI-based neurofeedback.

Corresponding author: Martin Klasen, Department of Psychiatry, Psychotherapy and Psychosomatics, RWTH Aachen University, Pauwelsstrasse 30, 52074 Aachen, Germany

Received: 2012-03-15

Accepted: 2012-05-02

Published Online: 2012-06-20

Published in Print: 2012-08-01

Citation Information: Reviews in the Neurosciences, Volume 23, Issue 4, ISSN (Online) 2191-0200, ISSN (Print) 0334-1763, DOI: https://doi.org/10.1515/revneuro-2012-0040

Citing Articles

Here you can find all Crossref-listed publications in which this article is cited.

Marianna Giannitelli, Jean Xavier, Anne François, Nicolas Bodeau, Claudine Laurent, David Cohen, and Laurence Chaby
Schizophrenia Research, 2015
A. Zinchenko, P. Kanske, C. Obermeier, E. Schröger, and S. A. Kotz
Social Cognitive and Affective Neuroscience, 2015
Antje B. M. Gerdes, Matthias J. Wieser, and Georg W. Alpers
Frontiers in Psychology, 2014, Volume 5
Martin Klasen, Benjamin Kreifelts, Yu-Han Chen, Janina Seubert, and Klaus Mathiak
Frontiers in Human Neuroscience, 2014, Volume 8
J. Kokinous, S. A. Kotz, A. Tavano, and E. Schröger
Social Cognitive and Affective Neuroscience, 2015, Volume 10, Number 5, Page 713
Liliana R. Demenescu, Krystyna A. Mathiak, and Klaus Mathiak
Experimental Aging Research, 2014, Volume 40, Number 2, Page 187
Thomas Lundeberg, Iréne Lund, and Jan Näslund
Acupuncture and Related Therapies, 2012, Volume 1, Number 1, Page 10
