
Reviews in the Neurosciences


Representation of temporal sound features in the human auditory cortex

Kirill V. Nourski1 / John F. Brugge1

1Human Brain Research Laboratory, Department of Neurosurgery, The University of Iowa, 200 Hawkins Dr., Iowa City, IA 52242, USA


Citation Information: Reviews in the Neurosciences. Volume 22, Issue 2, Pages 187–203, ISSN (Online) 2191-0200, ISSN (Print) 0334-1763, DOI: https://doi.org/10.1515/rns.2011.016, April 2011

Publication History

Published Online:
2011-04-12

Abstract

Temporal information in acoustic signals is important for the perception of environmental sounds, including speech. This review focuses on several aspects of temporal processing within human auditory cortex and its relevance for the processing of speech sounds. Periodic non-speech sounds, such as trains of acoustic clicks and bursts of amplitude-modulated noise or tones, can elicit different percepts depending on the pulse repetition rate or modulation frequency. Such sounds provide convenient methodological tools to study representation of timing information in the auditory system. At low repetition rates of up to 8–10 Hz, each individual stimulus (a single click or a sinusoidal amplitude modulation cycle) within the sequence is perceived as a separate event. As repetition rates increase up to and above approximately 40 Hz, these events blend together, giving rise first to the percept of flutter and then to pitch. The extent to which neural responses of human auditory cortex encode temporal features of acoustic stimuli is discussed within the context of these perceptual classes of periodic stimuli and their relationship to speech sounds. Evidence for neural coding of temporal information at the level of the core auditory cortex in humans suggests possible physiological counterparts to perceptual categorical boundaries for periodic acoustic stimuli. Temporal coding is less evident in auditory cortical fields beyond the core. Finally, data suggest hemispheric asymmetry in temporal cortical processing.

Keywords: amplitude modulation; Heschl’s gyrus; intracranial recording; phase locking; speech; superior temporal gyrus; temporal envelope
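The stimulus classes and perceptual boundaries summarized in the abstract can be illustrated with a short synthesis sketch. The following minimal Python example, not taken from the article, generates an acoustic click train and a sinusoidally amplitude-modulated (SAM) tone and maps a repetition rate onto the approximate perceptual categories the review describes (individual events up to roughly 8–10 Hz, flutter at intermediate rates, pitch from roughly 40 Hz upward). The sampling rate, function names, and exact boundary values are illustrative assumptions, not specifications from the article.

    import numpy as np

    FS = 44_100  # sampling rate in Hz (assumed; not specified in the article)

    def click_train(rate_hz, duration_s=1.0, click_width_s=1e-4):
        """Rectangular click train at the given pulse repetition rate."""
        n = int(FS * duration_s)
        x = np.zeros(n)
        period = int(FS / rate_hz)           # samples between click onsets
        width = max(1, int(FS * click_width_s))
        for start in range(0, n, period):
            x[start:start + width] = 1.0
        return x

    def am_tone(mod_hz, carrier_hz=1000.0, duration_s=1.0, depth=1.0):
        """Sinusoidally amplitude-modulated tone (100% modulation depth by default)."""
        t = np.arange(int(FS * duration_s)) / FS
        envelope = (1.0 + depth * np.sin(2 * np.pi * mod_hz * t)) / (1.0 + depth)
        return envelope * np.sin(2 * np.pi * carrier_hz * t)

    def perceptual_category(rate_hz):
        """Approximate perceptual class of a periodic stimulus, per the abstract."""
        if rate_hz <= 10:      # up to ~8-10 Hz: each click/cycle heard as a separate event
            return "discrete events"
        elif rate_hz < 40:     # intermediate rates: acoustic flutter
            return "flutter"
        else:                  # ~40 Hz and above: pitch
            return "pitch"

    for rate in (4, 20, 100):
        print(rate, "Hz ->", perceptual_category(rate))

Running the final loop prints "discrete events", "flutter", and "pitch" for 4, 20, and 100 Hz, respectively; the hard thresholds stand in for what are, perceptually, gradual category transitions.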

