Motor and visual influences on auditory neural processing during speaking and listening

Cited by: 3
Author
Sato, Marc [1 ,2 ]
Affiliations
[1] Aix Marseille Univ, Ctr Natl Rech Sci, Lab Parole & Langage, Aix En Provence, France
[2] Aix Marseille Univ, Lab Parole & Langage, UMR 7309, CNRS, 5 Ave Pasteur, F-13100 Aix En Provence, France
Keywords
Speech production; Audiovisual speech perception; Speaking-induced suppression; Efference copy; Corollary discharge; Readiness potential; EEG; COROLLARY DISCHARGE DYSFUNCTION; ELECTROPHYSIOLOGICAL EVIDENCE; MULTISENSORY INTEGRATION; LIP-READ; N1 WAVE; SPEECH; CORTEX; POTENTIALS; SOUND
DOI
10.1016/j.cortex.2022.03.013
Chinese Library Classification (CLC)
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology]
Discipline Classification Codes
03; 0303; 030303; 04; 0402
Abstract
During speaking or listening, endogenous motor or exogenous visual processes have been shown to fine-tune the auditory neural processing of the incoming acoustic speech signal. To compare the impact of these cross-modal effects on auditory evoked responses, two sets of speech production and perception tasks were contrasted using EEG. In a first set, participants produced vowels in a self-paced manner while listening to their auditory feedback. Following the production task, they passively listened to the entire recorded speech sequence. In a second set, the procedure was identical except that participants also watched their own articulatory movements online. While both endogenous motor and exogenous visual processes fine-tuned auditory neural processing, these cross-modal effects were found to act differentially on the amplitude and latency of auditory evoked responses. A reduced amplitude was observed on auditory evoked responses during speaking compared to listening, irrespective of the auditory or audiovisual feedback. Adding orofacial visual movements to the acoustic speech signal also shortened the latency of auditory evoked responses, irrespective of the perception or production task. Taken together, these results suggest distinct motor and visual influences on auditory neural processing, possibly through different neural gating and predictive mechanisms. (c) 2022 Elsevier Ltd. All rights reserved.
Pages: 21-35 (15 pages)