Catching audiovisual mice: Predicting the arrival time of auditory-visual motion signals

Cited by: 19
Authors
Hofbauer, M. [1 ]
Wuerger, S. M. [2 ]
Meyer, G. F. [2 ]
Roehrbein, F. [3 ]
Schill, K. [3 ]
Zetzsche, C. [1 ,3 ]
Affiliations
[1] Univ Munich, D-81377 Munich, Germany
[2] Univ Liverpool, Liverpool L69 3BX, Merseyside, England
[3] Univ Bremen, D-28359 Bremen, Germany
Keywords
DOI
10.3758/CABN.4.2.241
Chinese Library Classification (CLC)
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology];
Discipline classification codes
03; 0303; 030303; 04; 0402;
Abstract
We investigated the extent to which auditory and visual motion signals are combined when observers are asked to predict the location of a virtually moving target. In Condition 1, the unimodal and bimodal signals were noisy, but the target object was continuously visible and audible; in Condition 2, the virtually moving object was hidden (invisible and inaudible) for a short period prior to its arrival at the target location. Our main finding was that the facilitation due to simultaneous visual and auditory input is very different for the two conditions. When the target is continuously visible and audible (Condition 1), the bimodal performance is twice as good as the unimodal performances, thus suggesting a very effective integration mechanism. On the other hand, if the object is hidden for a short period (Condition 2) and the task therefore requires the extrapolation of motion speed over a temporal and spatial period, the facilitation due to both sensory inputs is almost absent, and the bimodal performance is limited by the visual performance.
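The "twice as good" bimodal result in Condition 1 is most naturally read against the standard maximum-likelihood (variance-weighted) cue-combination benchmark used in this literature. The minimal Python sketch below is an illustrative aside, not part of the original record or the authors' analysis; it assumes that "performance" scales inversely with response noise (sigma) and uses hypothetical unimodal noise values.

    import math

    def mle_bimodal_sigma(sigma_a: float, sigma_v: float) -> float:
        # Optimal (maximum-likelihood) cue combination predicts
        # 1/sigma_av^2 = 1/sigma_a^2 + 1/sigma_v^2.
        return math.sqrt(1.0 / (1.0 / sigma_a ** 2 + 1.0 / sigma_v ** 2))

    # Hypothetical equal unimodal noise levels (arbitrary units).
    sigma_a = sigma_v = 1.0
    print(mle_bimodal_sigma(sigma_a, sigma_v))
    # ~0.707: only a sqrt(2) gain in precision, so a factor-of-two
    # improvement would exceed this optimal-combination benchmark.

Under these assumptions, equal unimodal reliabilities predict at most a sqrt(2) improvement from optimal combination, which is why a factor-of-two bimodal advantage points to a particularly effective integration mechanism.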
Pages: 241-250
Page count: 10