Spatial variations of visual-auditory fusion areas

Cited by: 27

Authors
Godfroy, M [1 ]
Roumes, C [1 ]
Dauchy, P [1 ]
Affiliations
[1] IMASSA, Dept Cognit Sci, F-91223 Bretigny sur Orge, France
DOI: 10.1068/p3344
Chinese Library Classification (CLC): R77 [Ophthalmology]
Discipline code: 100212
Abstract
The tolerance to spatial disparity between two synchronous visual and auditory components of a bimodal stimulus has been investigated in order to assess their respective contributions to perceptual fusion. The visual and auditory systems each have specific information-processing mechanisms, and provide different cues for scene perception, with the respective dominance of space for vision and of time for hearing. A broadband noise burst and a spot of light, 500 ms in duration, have been simultaneously presented to participants who had to judge whether these cues referred to a single spatial event. We examined the influence of (i) the range and the direction of spatial disparity between the visual and auditory components of the stimulus and (ii) the eccentricity of the bimodal stimulus in the observer's perceptual field. Size and shape properties of visual-auditory fusion areas have been determined in two dimensions. The greater the eccentricity within the perceptual field, the greater the dimension of these areas; however, this increase in size also depends on whether the direction of the disparity is vertical or horizontal. Furthermore, the relative location of visual and auditory signals significantly modifies the perception of unity in the vertical plane. The shape of the fusion areas, their variation in the field, and the perceptual result associated with the relative location of the visual and auditory components of the stimulus converge toward a strong contribution of audition to visual-auditory fusion. The spatial ambiguity of the localisation capabilities of the auditory system may play a more essential role than accurate visual resolution in determining fusion.
Pages: 1233-1245 (13 pages)
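The abstract describes fusion areas as two-dimensional regions whose size grows with stimulus eccentricity and differs between horizontal and vertical disparity directions. The following is a minimal, purely illustrative Python sketch of that idea, assuming a toy elliptical fusion area; the function names, baseline sizes, and growth factor are hypothetical and are not taken from the paper.

```python
# Illustrative sketch only: a toy model of an elliptical visual-auditory
# fusion area whose size grows with eccentricity, loosely inspired by the
# qualitative findings summarised in the abstract. All parameter names and
# values are hypothetical and NOT taken from the paper.

def fusion_area_axes(eccentricity_deg,
                     base_horizontal_deg=3.0,   # hypothetical baseline width
                     base_vertical_deg=6.0,     # hypothetical baseline height
                     growth_per_deg=0.2):       # hypothetical growth factor
    """Return (horizontal, vertical) semi-axes of a toy fusion ellipse.

    The axes expand linearly with eccentricity, and the vertical axis is
    taken to be larger than the horizontal one, mimicking (for illustration
    only) the anisotropy reported in the abstract.
    """
    scale = 1.0 + growth_per_deg * eccentricity_deg
    return base_horizontal_deg * scale, base_vertical_deg * scale

def is_fused(horiz_disparity_deg, vert_disparity_deg, eccentricity_deg):
    """Judge whether a visual-auditory disparity falls inside the toy ellipse."""
    a, b = fusion_area_axes(eccentricity_deg)
    return (horiz_disparity_deg / a) ** 2 + (vert_disparity_deg / b) ** 2 <= 1.0

if __name__ == "__main__":
    # Example: the same 4 deg horizontal disparity at 0 deg vs 20 deg eccentricity.
    print(is_fused(4.0, 0.0, eccentricity_deg=0.0))   # False with these toy values
    print(is_fused(4.0, 0.0, eccentricity_deg=20.0))  # True: larger area in the periphery
```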