No "Self" Advantage for Audiovisual Speech Aftereffects

Cited: 0
Authors
Modelska, Maria [1 ]
Pourquie, Marie [1 ,2 ]
Baart, Martijn [1 ,3 ]
Affiliations
[1] BCBL Basque Ctr Cognit Brain & Language, Donostia San Sebastian, Spain
[2] UPPA, IKER UMR5478, Bayonne, France
[3] Tilburg Univ, Dept Cognit Neuropsychol, Tilburg, Netherlands
Source
FRONTIERS IN PSYCHOLOGY | 2019, Vol. 10
Keywords
speech perception; self-advantage; recalibration; adaptation; lip-reading; SELECTIVE ADAPTATION; VISUAL SPEECH; ELECTROPHYSIOLOGICAL EVIDENCE; PHONETIC RECALIBRATION; AUDITORY SPEECH; HEARING-LIPS; PERCEPTION; IDENTIFICATION; INFORMATION; LISTENERS;
DOI
10.3389/fpsyg.2019.00658
Chinese Library Classification (CLC)
B84 [Psychology]
Discipline code
04 ; 0402 ;
Abstract
Although the default state of the world is that we see and hear other people talking, there is evidence that seeing and hearing ourselves rather than someone else may lead to visual (i.e., lip-read) or auditory "self" advantages. We assessed whether there is a "self" advantage for phonetic recalibration (a lip-read driven cross-modal learning effect) and selective adaptation (a contrastive effect in the opposite direction of recalibration). We observed both aftereffects as well as an on-line effect of lip-read information on auditory perception (i.e., immediate capture), but there was no evidence for a "self" advantage in any of the tasks (as additionally supported by Bayesian statistics). These findings strengthen the emerging notion that recalibration reflects a general learning mechanism, and bolster the argument that adaptation depends on rather low-level auditory/acoustic features of the speech signal.
Pages: 10