Auditory enhancement of visual perception at threshold depends on visual abilities

Cited by: 20
Authors
Caclin, Anne [1 ,2 ,3 ]
Bouchet, Patrick [1 ,2 ,3 ]
Djoulah, Farida [1 ,2 ,3 ]
Pirat, Elodie [1 ,2 ,3 ]
Pernier, Jacques [1 ,2 ,3 ]
Giard, Marie-Helene [1 ,2 ,3 ]
Affiliations
[1] INSERM, Lyon Neurosci Res Ctr, Brain Dynam & Cognit Team, U1028, F-69000 Lyon, France
[2] CNRS, Lyon Neurosci Res Ctr, Brain Dynam & Cognit Team, UMR5292, F-69000 Lyon, France
[3] Univ Lyon 1, F-69000 Lyon, France
Keywords
Audiovisual interactions; Detection threshold; Collinear facilitation; Inter-individual variability; CONVERGING AUDIOVISUAL INPUTS; SPATIAL INTERACTIONS; PSYCHOPHYSICAL ANALYSIS; CONTRAST DETECTION; ACOUSTICAL VISION; STIMULATION; SENSITIVITY; INTEGRATION; STIMULUS; DISCRIMINATION;
DOI
10.1016/j.brainres.2011.04.016
Chinese Library Classification (CLC)
Q189 [Neuroscience];
Subject classification code
071006;
Abstract
Whether or not multisensory interactions can improve detection thresholds, and thus widen the range of perceptible events, is a long-standing debate. Here we revisit this question by testing the influence of auditory stimuli on the visual detection threshold in subjects exhibiting a wide range of visual-only performance. Above the perceptual threshold, crossmodal interactions have indeed been reported to depend on the subject's performance when the modalities are presented in isolation. We therefore tested normal-seeing subjects and short-sighted subjects wearing their usual glasses. We used a paradigm that limits potential shortcomings of previous studies: we chose a criterion-free threshold measurement procedure and precluded exogenous cueing effects by systematically presenting a visual cue whenever a visual target (a faint Gabor patch) might occur. Using this carefully controlled procedure, we found that concurrent sounds improved visual detection thresholds only in the sub-group of subjects exhibiting the poorest performance in the visual-only conditions. In these subjects, for oblique orientations of the visual stimuli (but not for vertical or horizontal targets), the auditory improvement persisted even when visual detection was already aided by flanking visual stimuli generating a collinear facilitation effect. These findings highlight that crossmodal interactions are most effective at improving perceptual performance when an isolated modality is deficient. (C) 2011 Elsevier B.V. All rights reserved.
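The abstract mentions a criterion-free threshold measurement procedure but does not spell it out. The minimal sketch below illustrates one common criterion-free approach, a two-interval forced-choice (2IFC) staircase with a 2-down/1-up rule, run against a simulated observer; the function names, parameter values, and the simulated psychometric function are illustrative assumptions, not the authors' actual method.

```python
# Hypothetical sketch of a criterion-free 2IFC contrast-threshold staircase.
# All parameters and the simulated observer are assumptions for illustration.
import math
import random

def p_correct(contrast, threshold=0.02, slope=3.0):
    """Simulated 2IFC observer: chance level 0.5, rising toward 1.0 with contrast."""
    return 1.0 - 0.5 * math.exp(-((contrast / threshold) ** slope))

def staircase_2ifc(start=0.08, n_trials=80, up_factor=1.25):
    """2-down/1-up staircase on target contrast.

    Criterion-free: the observer only reports which of two intervals held
    the target, so the estimate does not depend on a yes/no response bias.
    """
    down_factor = 1.0 / up_factor          # symmetric steps on a log axis
    contrast, streak = start, 0
    reversals, direction = [], 0           # direction: +1 = harder, -1 = easier
    for _ in range(n_trials):
        correct = random.random() < p_correct(contrast)
        if correct:
            streak += 1
            if streak == 2:                # two correct in a row -> make harder
                streak = 0
                if direction == -1:
                    reversals.append(contrast)
                direction = +1
                contrast *= down_factor
        else:                              # one error -> make easier
            streak = 0
            if direction == +1:
                reversals.append(contrast)
            direction = -1
            contrast *= up_factor
    tail = reversals[-6:] if reversals else [contrast]
    # Threshold estimate: geometric mean of the last few reversal contrasts.
    return math.exp(sum(math.log(c) for c in tail) / len(tail))

if __name__ == "__main__":
    print(f"Estimated contrast threshold: {staircase_2ifc():.4f}")
```

With equal up and down steps on a log axis, the 2-down/1-up rule converges near the 70.7%-correct point of the psychometric function, which is one standard way to define a detection threshold without relying on the observer's response criterion.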
Pages: 35-44
Number of pages: 10
Related papers
50 records in total
  • [31] Audiovisual Speech Perception in Infancy: The Influence of Vowel Identity and Infants' Productive Abilities on Sensitivity to (Mis)Matches Between Auditory and Visual Speech Cues
    Altvater-Mackensen, Nicole
    Mani, Nivedita
    Grossmann, Tobias
    DEVELOPMENTAL PSYCHOLOGY, 2016, 52 (02) : 191 - 204
  • [32] Age-related changes in auditory and visual interactions in temporal rate perception
    Brooks, Cassandra J.
    Anderson, Andrew J.
    Roach, Neil W.
    McGraw, Paul V.
    McKendrick, Allison M.
    JOURNAL OF VISION, 2015, 15 (16):
  • [33] Effects of the Simultaneous Presentation of Corresponding Auditory and Visual Stimuli on Size Variance Perception
    Ueda, Sachiyo
    Mizuguchi, Ayane
    Yakushijin, Reiko
    Ishiguchi, Akira
    I-PERCEPTION, 2018, 9 (06):
  • [34] The speed and temporal frequency of visual apparent motion modulate auditory duration perception
    He, Xiang
    Ke, Zijun
    Wu, Zehua
    Chen, Lihan
    Yue, Zhenzhu
    SCIENTIFIC REPORTS, 2023, 13 (01)
  • [35] Visual and auditory brain areas share a representational structure that supports emotion perception
    Sievers, Beau
    Parkinson, Carolyn
    Kohler, Peter J.
    Hughes, James M.
Fogelson, Sergey V.
    Wheatley, Thalia
    CURRENT BIOLOGY, 2021, 31 (23) : 5192 - +
  • [36] Influence of Visual Deprivation on Auditory Spectral Resolution, Temporal Resolution, and Speech Perception
    Shim, Hyun Joon
    Go, Geurim
    Lee, Heirim
    Choi, Sung Won
    Won, Jong Ho
    FRONTIERS IN NEUROSCIENCE, 2019, 13
  • [37] Similar effect of running on visual and auditory time perception in the ranges of milliseconds and seconds
    Petrizzo, Irene
    Chelli, Eleonora
    Bartolini, Tommaso
    Arrighi, Roberto
    Anobile, Giovanni
    FRONTIERS IN PSYCHOLOGY, 2023, 14
  • [38] Use of cues in virtual reality depends on visual feedback
    Fulvio, Jacqueline M.
    Rokers, Bas
    SCIENTIFIC REPORTS, 2017, 7
  • [39] Form and Function in Information for Visual Perception
    Lappin, Joseph S.
    Bell, Herbert H.
    I-PERCEPTION, 2021, 12 (06):
  • [40] Development and validation of a questionnaire for assessing visual and auditory spatial localization abilities in dual sensory impairment
    Xiong, Yingzi
    Nemargut, Joseph Paul
    Bradley, Chris
    Wittich, Walter
    Legge, Gordon E.
    SCIENTIFIC REPORTS, 2024, 14 (01)