What subcortical-cortical relationships tell us about processing speech in noise

Cited: 72
Authors
Parbery-Clark, Alexandra [1 ,2 ]
Marmel, Frederic [1 ]
Bair, Julia [3 ]
Kraus, Nina [1 ,2 ,4 ,5 ]
Affiliations
[1] Auditory Neurosci Lab, Evanston, IL 60208 USA
[2] Northwestern Univ, Dept Commun Sci, Evanston, IL USA
[3] Oberlin Coll, Dept Neurosci, Oberlin, OH 44074 USA
[4] Northwestern Univ, Dept Neurobiol & Physiol, Evanston, IL 60208 USA
[5] Northwestern Univ, Dept Otolaryngol, Evanston, IL 60208 USA
Keywords
auditory brainstem response; cortical auditory-evoked potentials; N1; speech in noise; AUDITORY BRAIN-STEM; EVOKED NEUROMAGNETIC FIELDS; BROAD-BAND NOISE; IN-NOISE; RECEPTIVE-FIELDS; OLIVOCOCHLEAR REFLEX; STOP CONSONANTS; MASKED TONES; PERCEPTION; POTENTIALS;
DOI
10.1111/j.1460-9568.2010.07546.x
Chinese Library Classification
Q189 [Neuroscience]
Discipline code
071006
Abstract
To advance our understanding of the biological basis of speech-in-noise perception, we investigated the effects of background noise on both subcortical and cortical evoked responses, and the relationships between them, in normal-hearing young adults. The addition of background noise modulated subcortical and cortical response morphology. In noise, subcortical responses were later, smaller in amplitude and demonstrated decreased neural precision in encoding the speech sound. Cortical responses were also delayed by noise, yet the amplitudes of the major peaks (N1, P2) were affected differently, with N1 increasing and P2 decreasing. Relationships between neural measures and speech-in-noise ability were identified, with earlier subcortical responses, higher subcortical response fidelity and greater cortical N1 response magnitude all relating to better speech-in-noise perception. Furthermore, it was only with the addition of background noise that relationships between subcortical and cortical encoding of speech and the behavioral measures of speech in noise emerged. Results illustrate that human brainstem responses and N1 cortical response amplitude reflect coordinated processes with regard to the perception of speech in noise, thereby acting as a functional index of speech-in-noise perception.
Pages: 549-557 (9 pages)