Electrophysiological evidence for speech-specific audiovisual integration

Cited: 74
Authors
Baart, Martijn [1 ,2 ]
Stekelenburg, Jeroen J. [2 ]
Vroomen, Jean [2 ]
Affiliations
[1] Basque Ctr Cognit Brain & Language, Donostia San Sebastian 20009, Spain
[2] Tilburg Univ, Dept Cognit Neuropsychol, POB 90153, Warandelaan 2, NL-5000 LE Tilburg, Netherlands
Keywords
N1; P2; Audiovisual speech; Sine-wave speech; Audiovisual integration; AUDITORY-VISUAL INTERACTIONS; MULTISENSORY INTEGRATION; SELECTIVE-ATTENTION; MISMATCH NEGATIVITY; PERCEPTION; INFORMATION; HUMANS; WINDOW; MODE
DOI
10.1016/j.neuropsychologia.2013.11.011
Chinese Library Classification
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology]
Subject classification codes
03; 0303; 030303; 04; 0402
Abstract
Lip-read speech is integrated with heard speech at various neural levels. Here, we investigated the extent to which lip-read-induced modulations of the auditory N1 and P2 (measured with EEG) are indicative of speech-specific audiovisual integration, and we explored to what extent the ERPs were modulated by phonetic audiovisual congruency. In order to disentangle speech-specific (phonetic) integration from non-speech integration, we used sine-wave speech (SWS) that was perceived as speech by half of the participants (they were in speech mode), while the other half were in non-speech mode. Results showed that the N1 obtained with audiovisual stimuli peaked earlier than the N1 evoked by auditory-only stimuli. This lip-read-induced speeding up of the N1 occurred for listeners in both speech and non-speech mode. In contrast, lip-read speech also modulated the auditory P2 if listeners were in speech mode, but not if they were in non-speech mode, thus revealing speech-specific audiovisual binding. Comparing ERPs for phonetically congruent audiovisual stimuli with ERPs for incongruent stimuli revealed an effect of phonetic stimulus congruency that started at ~200 ms after the (in)congruence became apparent. Critically, akin to the P2 suppression, congruency effects were only observed if listeners were in speech mode, not if they were in non-speech mode. Using identical stimuli, we thus confirm that audiovisual binding involves (partially) different neural mechanisms for sound processing in speech and non-speech mode. (C) 2013 Elsevier Ltd. All rights reserved.
Pages: 115-121
Number of pages: 7
Related papers
50 records
  • [41] Audiovisual integration of speech: evidence for increased accuracy in “talk” versus “listen” condition
    Zografos, Lefteris Themelis
    Konstantoulaki, Anna
    Klein, Christoph
    Vatakis, Argiro
    Smyrnis, Nikolaos
    Experimental Brain Research, 2025, 243 (6)
  • [42] Formant-frequency variation and informational masking of speech by extraneous formants: Evidence against dynamic and speech-specific acoustical constraints
    Roberts, Brian
    Summers, Robert J.
    Bailey, Peter J.
    Journal of Experimental Psychology: Human Perception and Performance, 2014, 40 (4): 1507-1525
  • [43] The role of speech-specific properties of the background in the irrelevant sound effect
    Viswanathan, Navin
    Dorsi, Josh
    George, Stephanie
    Quarterly Journal of Experimental Psychology, 2014, 67 (3): 581-589
  • [44] Audiovisual speech integration in the superior temporal region is dysfunctional in dyslexia
    Ye, Zheng
    Ruesseler, Jascha
    Gerth, Ivonne
    Muente, Thomas F.
    Neuroscience, 2017, 356: 1-10
  • [45] Electrophysiological indices of audiovisual speech perception: Beyond the McGurk effect and speech in noise
    Irwin, Julia
    Avery, Trey
    Brancazio, Lawrence
    Turcios, Jacqueline
    Ryherd, Kayleigh
    Landi, Nicole
    Multisensory Research, 2018, 31 (1-2): 39-56
  • [46] Exposure to asynchronous audiovisual speech extends the temporal window for audiovisual integration
    Navarra, J.
    Vatakis, A.
    Zampini, M.
    Soto-Faraco, S.
    Humphreys, W.
    Spence, C.
    Cognitive Brain Research, 2005, 25 (2): 499-507
  • [47] Audiovisual speech integration does not rely on the motor system: Evidence from articulatory suppression, the McGurk effect, and fMRI
    Matchin, William
    Groulx, Kier
    Hickok, Gregory
    Journal of Cognitive Neuroscience, 2014, 26 (3): 606-620
  • [48] Attention to touch weakens audiovisual speech integration
    Alsius, Agnes
    Navarra, Jordi
    Soto-Faraco, Salvador
    Experimental Brain Research, 2007, 183 (3): 399-404
  • [49] Speech and non-speech measures of audiovisual integration are not correlated
    Wilbiks, Jonathan M. P.
    Brown, Violet A.
    Strand, Julia F.
    Attention, Perception, & Psychophysics, 2022, 84: 1809-1819
  • [50] Attention to touch weakens audiovisual speech integration
    Alsius, Agnès
    Navarra, Jordi
    Soto-Faraco, Salvador
    Experimental Brain Research, 2007, 183: 399-404