Audio-visual spatial alignment improves integration in the presence of a competing audio-visual stimulus

Cited: 17
Authors
Fleming, Justin T. [1 ]
Noyce, Abigail L. [2 ]
Shinn-Cunningham, Barbara G. [3 ]
Affiliations
[1] Harvard Med Sch, Div Med Sci, Speech & Hearing Biosci & Technol Program, Boston, MA USA
[2] Boston Univ, Dept Psychol & Brain Sci, Boston, MA 02215 USA
[3] Carnegie Mellon Univ, Neurosci Inst, Pittsburgh, PA 15213 USA
Keywords
Audio-visual integration; Attention; Visual search; Electroencephalography; Temporal coherence; Spatial alignment; AUDITORY-VISUAL INTERACTIONS; EVENT-RELATED POTENTIALS; TEMPORAL BINDING WINDOW; MULTISENSORY INTEGRATION; SELECTIVE ATTENTION; INDIVIDUAL-DIFFERENCES; REACTION-TIME; CORTEX; HUMANS; RESPONSES;
DOI
10.1016/j.neuropsychologia.2020.107530
Chinese Library Classification (CLC)
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology];
Discipline Classification Codes
03 ; 0303 ; 030303 ; 04 ; 0402 ;
Abstract
In order to parse the world around us, we must constantly determine which sensory inputs arise from the same physical source and should therefore be perceptually integrated. Temporal coherence between auditory and visual stimuli drives audio-visual (AV) integration, but the role played by AV spatial alignment is less well understood. Here, we manipulated AV spatial alignment and collected electroencephalography (EEG) data while human subjects performed a free-field variant of the "pip and pop" AV search task. In this paradigm, visual search is aided by a spatially uninformative auditory tone, the onsets of which are synchronized to changes in the visual target. In Experiment 1, tones were either spatially aligned or spatially misaligned with the visual display. Regardless of AV spatial alignment, we replicated the key pip and pop result of improved AV search times. Mirroring the behavioral results, we found an enhancement of early event-related potentials (ERPs), particularly the auditory N1 component, in both AV conditions. We demonstrate that both top-down and bottom-up attention contribute to these N1 enhancements. In Experiment 2, we tested whether spatial alignment influences AV integration in a more challenging context with competing multisensory stimuli. An AV foil was added that visually resembled the target and was synchronized to its own stream of synchronous tones. The visual components of the AV target and AV foil occurred in opposite hemifields; the two auditory components were also in opposite hemifields and were either spatially aligned or spatially misaligned with the visual components to which they were synchronized. Search was fastest when the auditory and visual components of the AV target (and the foil) were spatially aligned. 
Attention modulated ERPs in both spatial conditions, but importantly, the scalp topography of early evoked responses shifted only when stimulus components were spatially aligned, signaling the recruitment of different neural generators likely related to multisensory integration. These results suggest that AV integration depends on AV spatial alignment when stimuli in both modalities compete for selective integration, a common scenario in real-world perception.
Pages: 17
Related Articles
50 records total
  • [21] Audio-visual speech perception is special
    Tuomainen, J
    Andersen, TS
    Tiippana, K
    Sams, M
    COGNITION, 2005, 96 (01) : B13 - B22
  • [22] Audio-visual simultaneity judgments
    Zampini, Massimiliano
    Guest, Steve
    Shore, David I.
    Spence, Charles
    PERCEPTION & PSYCHOPHYSICS, 2005, 67 : 531 - 544
  • [23] Streaming Audio-Visual Speech Recognition with Alignment Regularization
    Ma, Pingchuan
    Moritz, Niko
    Petridis, Stavros
    Fuegen, Christian
    Pantic, Maja
    INTERSPEECH 2023, 2023, : 1598 - 1602
  • [24] Audio-visual stimulation improves oculomotor patterns in patients with hemianopia
    Passamonti, Claudia
    Bertini, Caterina
    Ladavas, Elisabetta
    NEUROPSYCHOLOGIA, 2009, 47 (02) : 546 - 555
  • [25] Audio-visual facilitation of the mu rhythm
    McGarry, Lucy M.
    Russo, Frank A.
    Schalles, Matt D.
    Pineda, Jaime A.
    EXPERIMENTAL BRAIN RESEARCH, 2012, 218 (04) : 527 - 538
  • [26] Optimality and Limitations of Audio-Visual Integration for Cognitive Systems
    Boyce, William Paul
    Lindsay, Anthony
    Zgonnikov, Arkady
    Rano, Inaki
    Wong-Lin, KongFatt
    FRONTIERS IN ROBOTICS AND AI, 2020, 7
  • [27] Neural dynamics driving audio-visual integration in autism
    Ronconi, Luca
    Vitale, Andrea
    Federici, Alessandra
    Mazzoni, Noemi
    Battaglini, Luca
    Molteni, Massimo
    Casartelli, Luca
    CEREBRAL CORTEX, 2023, 33 (03) : 543 - 556
  • [28] Audio-visual and olfactory-visual integration in healthy participants and subjects with autism spectrum disorder
    Stickel, Susanne
    Weismann, Pauline
    Kellermann, Thilo
    Regenbogen, Christina
    Habel, Ute
    Freiherr, Jessica
    Chechko, Natalya
    HUMAN BRAIN MAPPING, 2019, 40 (15) : 4470 - 4486
  • [29] Audio-Visual Saliency Network with Audio Attention Module
    Cheng, Shuaiyang
    Gao, Xing
    Song, Liang
    Xiahou, Jianbing
    PROCEEDINGS OF 2021 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND INFORMATION SYSTEMS (ICAIIS '21), 2021,
  • [30] Cortical Plasticity of Audio-Visual Object Representations
    Naumer, Marcus J.
    Doehrmann, Oliver
    Mueller, Notger G.
    Muckli, Lars
    Kaiser, Jochen
    Hein, Grit
    CEREBRAL CORTEX, 2009, 19 (07) : 1641 - 1653