Sensorimotor synchronization with audio-visual stimuli: limited multisensory integration

Cited by: 0
Authors
Alan Armstrong
Johann Issartel
Affiliations
[1] Dublin City University, Multisensory Motor Learning Laboratory (M²L²), School of Health and Human Performance
Source
Experimental Brain Research
Keywords
Sensorimotor synchronization; Multisensory; Audio-visual; Spatio-temporal; Anchoring;
DOI
Not available
Abstract
Understanding how we synchronize our actions with stimuli from different sensory modalities plays a central role in helping to establish how we interact with our multisensory environment. Recent research has shown better performance with multisensory over unisensory stimuli; however, the types of stimuli used have mainly been auditory and tactile. The aim of this article was to expand our understanding of sensorimotor synchronization with multisensory audio-visual stimuli and to compare these findings with their individual unisensory counterparts. This research also aimed to assess the role of spatio-temporal structure for each sensory modality. The visual and/or auditory stimuli had either temporal or spatio-temporal information available and were presented to the participants in unimodal and bimodal conditions. Globally, performance was significantly better for the bimodal than for the unimodal conditions; however, this benefit was limited to only one of the bimodal conditions. In terms of the unimodal conditions, the level of synchronization with visual stimuli was better than with auditory stimuli, and while there was an observed benefit with the spatio-temporal compared to the temporal visual stimulus, this was not replicated with the auditory stimulus.
Pages: 3453–3463
Number of pages: 10
Citation: Armstrong, Alan; Issartel, Johann. Sensorimotor synchronization with audio-visual stimuli: limited multisensory integration. Experimental Brain Research, 2014, 232(11): 3453–3463.