Electrophysiological evidence for speech-specific audiovisual integration

Cited by: 74
Authors
Baart, Martijn [1 ,2 ]
Stekelenburg, Jeroen J. [2 ]
Vroomen, Jean [2 ]
Affiliations
[1] Basque Ctr Cognit Brain & Language, Donostia San Sebastian 20009, Spain
[2] Tilburg Univ, Dept Cognit Neuropsychol, POB 90153,Warandelaan 2, NL-5000 LE Tilburg, Netherlands
Keywords
N1; P2; Audiovisual speech; Sine-wave speech; Audiovisual integration; AUDITORY-VISUAL INTERACTIONS; MULTISENSORY INTEGRATION; SELECTIVE-ATTENTION; MISMATCH NEGATIVITY; PERCEPTION; INFORMATION; HUMANS; WINDOW; MODE
DOI
10.1016/j.neuropsychologia.2013.11.011
Chinese Library Classification (CLC)
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology]
Subject Classification Codes
03; 0303; 030303; 04; 0402
Abstract
Lip-read speech is integrated with heard speech at various neural levels. Here, we investigated the extent to which lip-read-induced modulations of the auditory N1 and P2 (measured with EEG) are indicative of speech-specific audiovisual integration, and we explored to what extent the ERPs were modulated by phonetic audiovisual congruency. To disentangle speech-specific (phonetic) integration from non-speech integration, we used sine-wave speech (SWS) that was perceived as speech by half of the participants (who were in speech mode), while the other half were in non-speech mode. Results showed that the N1 obtained with audiovisual stimuli peaked earlier than the N1 evoked by auditory-only stimuli. This lip-read-induced speeding up of the N1 occurred for listeners in both speech and non-speech mode. In contrast, lip-read speech also modulated the auditory P2 if listeners were in speech mode, but not if they were in non-speech mode, thus revealing speech-specific audiovisual binding. Comparing ERPs for phonetically congruent audiovisual stimuli with ERPs for incongruent stimuli revealed an effect of phonetic stimulus congruency that started at ~200 ms after the (in)congruence became apparent. Critically, akin to the P2 suppression, congruency effects were only observed if listeners were in speech mode, and not if they were in non-speech mode. Using identical stimuli, we thus confirm that audiovisual binding involves (partially) different neural mechanisms for sound processing in speech and non-speech mode. (C) 2013 Elsevier Ltd. All rights reserved.
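The central comparisons in the abstract (an earlier N1 peak for audiovisual than for auditory-only stimuli, and a P2 modulation that appears only for listeners in speech mode) boil down to peak-latency and peak-amplitude measurements in condition-averaged ERPs. The sketch below is a minimal illustration of that kind of measurement, not the authors' analysis pipeline: the sampling rate, analysis windows, and synthetic waveforms are assumptions, and real condition averages would replace the simulated arrays.

```python
# Minimal sketch (not the authors' pipeline): comparing N1 peak latency and
# P2 amplitude between auditory-only (A) and audiovisual (AV) ERP averages.
# All window boundaries, the sampling rate, and the waveforms are illustrative.
import numpy as np

def peak_latency_and_amplitude(erp, times, tmin, tmax, polarity):
    """Return (latency_s, amplitude) of the most extreme point in a time window.

    polarity = -1 finds a negative-going peak (e.g. N1), +1 a positive peak (e.g. P2).
    """
    mask = (times >= tmin) & (times <= tmax)
    window = erp[mask] * polarity
    idx = int(np.argmax(window))              # index of the most extreme sample
    return times[mask][idx], erp[mask][idx]

# --- illustrative synthetic condition averages (replace with real data) ---
sfreq = 500.0                                 # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.5, 1.0 / sfreq)     # -100 ms to +500 ms around sound onset
rng = np.random.default_rng(0)
erp_a  = (-2.0e-6 * np.exp(-((times - 0.110) / 0.02) ** 2)   # N1 around 110 ms
          + 1.5e-6 * np.exp(-((times - 0.200) / 0.03) ** 2)  # P2 around 200 ms
          + rng.normal(0, 1e-7, times.size))
erp_av = (-2.0e-6 * np.exp(-((times - 0.095) / 0.02) ** 2)   # earlier N1 (AV speeding up)
          + 0.9e-6 * np.exp(-((times - 0.200) / 0.03) ** 2)  # smaller P2 (suppression)
          + rng.normal(0, 1e-7, times.size))

# N1: negative peak in ~70-150 ms; P2: positive peak in ~150-250 ms (assumed windows)
n1_lat_a, _  = peak_latency_and_amplitude(erp_a,  times, 0.07, 0.15, -1)
n1_lat_av, _ = peak_latency_and_amplitude(erp_av, times, 0.07, 0.15, -1)
_, p2_amp_a  = peak_latency_and_amplitude(erp_a,  times, 0.15, 0.25, +1)
_, p2_amp_av = peak_latency_and_amplitude(erp_av, times, 0.15, 0.25, +1)

print(f"N1 latency   A: {n1_lat_a * 1e3:.0f} ms   AV: {n1_lat_av * 1e3:.0f} ms")
print(f"P2 amplitude A: {p2_amp_a * 1e6:.2f} uV   AV: {p2_amp_av * 1e6:.2f} uV")
```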
Pages: 115-121
Page count: 7
Related Papers (50 total)
  • [1] Electrophysiological evidence for a multisensory speech-specific mode of perception
    Stekelenburg, Jeroen J.
    Vroomen, Jean
    NEUROPSYCHOLOGIA, 2012, 50 (07) : 1425 - 1431
  • [2] Speech-specific audiovisual integration modulates induced theta-band oscillations
    Lindborg, Alma
    Baart, Martijn
    Stekelenburg, Jeroen J.
    Vroomen, Jean
    Andersen, Tobias S.
    PLOS ONE, 2019, 14 (07)
  • [3] Electrophysiological evidence for a self-processing advantage during audiovisual speech integration
    Treille, Avril
    Vilain, Coriandre
    Kandel, Sonia
    Sato, Marc
    EXPERIMENTAL BRAIN RESEARCH, 2017, 235 (09) : 2867 - 2876
  • [4] Audiovisual integration for speech during mid-childhood: Electrophysiological evidence
    Kaganovich, Natalya
    Schumaker, Jennifer
    BRAIN AND LANGUAGE, 2014, 139 : 36 - 48
  • [5] Electrophysiological Evidence of Nonspatial Inhibition of Return Affecting Audiovisual Integration
    Wang, Xiaoxue
    Tang, Xiaoyu
    Wang, Aijun
    Zhang, Ming
    NEUROPSYCHOLOGY, 2025, 39 (01) : 76 - 88
  • [6] Electrophysiological evidence for Audio-visuo-lingual speech integration
    Treille, Avril
    Vilain, Coriandre
    Schwartz, Jean-Luc
    Hueber, Thomas
    Sato, Marc
    NEUROPSYCHOLOGIA, 2018, 109 : 126 - 133
  • [8] The effect of face orientation on audiovisual speech integration in infancy: An electrophysiological study
    Szmytke, Magdalena
    Ilyka, Dianna
    Duda-Golawska, Joanna
    Laudanska, Zuzanna
    Malinowska-Korczak, Anna
    Tomalski, Przemyslaw
    DEVELOPMENTAL PSYCHOBIOLOGY, 2023, 65 (07)
  • [9] Neural mechanisms for the effect of prior knowledge on audiovisual integration
    Liu, Qiang
    Zhang, Ye
    Campos, Jennifer L.
    Zhang, Qinglin
    Sun, Hong-Jin
    BIOLOGICAL PSYCHOLOGY, 2011, 87 (02) : 200 - 208
  • [10] Multistage audiovisual integration of speech: dissociating identification and detection
    Eskelund, Kasper
    Tuomainen, Jyrki
    Andersen, Tobias S.
    EXPERIMENTAL BRAIN RESEARCH, 2011, 208 (03) : 447 - 457