Contributions of local speech encoding and functional connectivity to audio-visual speech perception
Cited by: 55
Authors:
Giordano, Bruno L. [1,2]
Ince, Robin A. A. [2]
Gross, Joachim [2]
Schyns, Philippe G. [2]
Panzeri, Stefano [3]
Kayser, Christoph [2]
Affiliations:
[1] Aix Marseille Univ, CNRS, UMR 7289, Inst Neurosci Timone, Marseille, France
[2] Univ Glasgow, Inst Neurosci & Psychol, Glasgow, Lanark, Scotland
[3] Ist Italiano Tecnol, Ctr Neurosci & Cognit Syst, Neural Computat Lab, Rovereto, Italy
Source:
eLife
Funding:
European Research Council;
Engineering and Physical Sciences Research Council (UK);
Wellcome Trust (UK);
Biotechnology and Biological Sciences Research Council (UK);
Keywords:
MULTISENSORY INTEGRATION;
AUDITORY-CORTEX;
PREMOTOR CORTEX;
VISUAL SPEECH;
CORTICAL REPRESENTATION;
NEURONAL OSCILLATIONS;
RIGHT-HEMISPHERE;
DEGRADED SPEECH;
DYNAMIC FACES;
BRAIN NETWORK;
DOI:
10.7554/eLife.24763
Chinese Library Classification:
Q [Biological Sciences];
Discipline codes:
07; 0710; 09;
Abstract:
Seeing a speaker's face enhances speech intelligibility in adverse environments. We investigated the underlying network mechanisms by quantifying local speech representations and directed connectivity in MEG data obtained while human participants listened to speech of varying acoustic SNR and visual context. During high acoustic SNR, speech encoding by temporally entrained brain activity was strong in temporal and inferior frontal cortex, while during low SNR, strong entrainment emerged in premotor and superior frontal cortex. These changes in local encoding were accompanied by changes in directed connectivity along the ventral stream and the auditory-premotor axis. Importantly, the behavioral benefit arising from seeing the speaker's face was not predicted by changes in local encoding but rather by enhanced functional connectivity between temporal and inferior frontal cortex. Our results demonstrate a role of auditory-frontal interactions in visual speech representations and suggest that functional connectivity along the ventral pathway facilitates speech comprehension in multisensory environments.
Pages: 27