No "Self" Advantage for Audiovisual Speech Aftereffects

Cited: 0
Authors
Modelska, Maria [1 ]
Pourquie, Marie [1 ,2 ]
Baart, Martijn [1 ,3 ]
Affiliations
[1] BCBL Basque Ctr Cognit Brain & Language, Donostia San Sebastian, Spain
[2] UPPA, IKER UMR5478, Bayonne, France
[3] Tilburg Univ, Dept Cognit Neuropsychol, Tilburg, Netherlands
Source
FRONTIERS IN PSYCHOLOGY | 2019, Vol. 10
Keywords
speech perception; self-advantage; recalibration; adaptation; lip-reading; SELECTIVE ADAPTATION; VISUAL SPEECH; ELECTROPHYSIOLOGICAL EVIDENCE; PHONETIC RECALIBRATION; AUDITORY SPEECH; HEARING-LIPS; PERCEPTION; IDENTIFICATION; INFORMATION; LISTENERS
DOI
10.3389/fpsyg.2019.00658
CLC Number
B84 [Psychology]
Subject Classification Code
04 ; 0402
Abstract
Although the default state of the world is that we see and hear other people talking, there is evidence that seeing and hearing ourselves rather than someone else may lead to visual (i.e., lip-read) or auditory "self" advantages. We assessed whether there is a "self" advantage for phonetic recalibration (a lip-read driven cross-modal learning effect) and selective adaptation (a contrastive effect in the opposite direction of recalibration). We observed both aftereffects as well as an on-line effect of lip-read information on auditory perception (i.e., immediate capture), but there was no evidence for a "self" advantage in any of the tasks (as additionally supported by Bayesian statistics). These findings strengthen the emerging notion that recalibration reflects a general learning mechanism, and bolster the argument that adaptation depends on rather low-level auditory/acoustic features of the speech signal.
Pages: 10