Processing of Visual Speech Cues in Speech-in-Noise Comprehension Depends on Working Memory Capacity and Enhances Neural Speech Tracking in Older Adults With Hearing Impairment

Cited: 1
Authors
Frei, Vanessa [1 ,2 ]
Schmitt, Raffael [1 ,2 ,3 ,4 ]
Meyer, Martin [3 ,4 ,5 ,6 ,7 ,8 ]
Giroud, Nathalie [1 ,2 ,3 ,4 ,6 ]
Affiliations
[1] Univ Zurich, Dept Computat Linguist, Computat Neurosci Speech & Hearing, Zurich, Switzerland
[2] Life Course Evolutionary & Ontogenet Dynam LIFE, Int Max Planck Res Sch, Berlin, Germany
[3] Univ Zurich, Ctr Med Fac, Competence Ctr Language & Med, Zurich, Switzerland
[4] Univ Zurich, Fac Arts & Sci, Zurich, Switzerland
[5] Univ Zurich, Univ Res Prior Program Dynam Hlth Aging, Zurich, Switzerland
[6] Univ & ETH Zurich, Ctr Neurosci Zurich, Zurich, Switzerland
[7] Univ Zurich, Dept Comparat Language Sci, Evolutionary Neurosci Language, Zurich, Switzerland
[8] Alpen Adria Univ, Cognit Psychol Unit, Klagenfurt, Austria
Source
TRENDS IN HEARING | 2024, Vol. 28
Funding
Swiss National Science Foundation
Keywords
neural speech tracking; audio-visual speech; age-related hearing loss; EEG; working memory capacity; speech in noise; AUDITORY-CORTEX; AUDIOVISUAL INTERACTIONS; COGNITIVE FUNCTION; COMPETING SPEECH; LISTENING EFFORT; PERCEPTION; INTEGRATION; OSCILLATIONS; PHASE; RECOGNITION;
DOI
10.1177/23312165241287622
Chinese Library Classification (CLC)
R36 [Pathology]; R76 [Otorhinolaryngology];
Discipline Classification Codes
100104; 100213;
Abstract
Comprehending speech in noise (SiN) poses a challenge for older hearing-impaired listeners, drawing on both auditory and working memory resources. Visual speech cues provide additional sensory information that supports speech understanding, but the extent of this visual benefit varies considerably across individuals, a variability that might be explained by individual differences in working memory capacity (WMC). In the current study, we investigated behavioral and neurofunctional (i.e., neural speech tracking) correlates of auditory and audio-visual speech comprehension in babble noise and their associations with WMC. Healthy older adults with hearing impairment, quantified by pure-tone threshold averages of 31.85-57 dB (N = 67), listened to sentences in babble noise in audio-only, visual-only, and audio-visual speech modalities and performed a pattern-matching and a comprehension task while electroencephalography (EEG) was recorded. Behaviorally, no significant difference in task performance was observed across modalities. However, we did find a significant association between individual working memory capacity and task performance, suggesting a more complex interplay between audio-visual speech cues, working memory capacity, and real-world listening tasks. Furthermore, we found that visual speech presentation was accompanied by increased cortical tracking of the speech envelope, particularly in a right-hemispheric auditory topographical cluster. Post hoc, we investigated potential relationships between behavioral performance and neural speech tracking but were unable to establish a significant association. Overall, our results show an increase in neurofunctional correlates of speech associated with congruent visual speech cues, specifically in a right auditory cluster, suggesting multisensory integration.
Pages: 20