Audio-visual spatial alignment improves integration in the presence of a competing audio-visual stimulus

Cited by: 17
Authors
Fleming, Justin T. [1 ]
Noyce, Abigail L. [2 ]
Shinn-Cunningham, Barbara G. [3 ]
Affiliations
[1] Harvard Med Sch, Div Med Sci, Speech & Hearing Biosci & Technol Program, Boston, MA USA
[2] Boston Univ, Dept Psychol & Brain Sci, Boston, MA 02215 USA
[3] Carnegie Mellon Univ, Neurosci Inst, Pittsburgh, PA 15213 USA
Keywords
Audio-visual integration; Attention; Visual search; Electroencephalography; Temporal coherence; Spatial alignment; AUDITORY-VISUAL INTERACTIONS; EVENT-RELATED POTENTIALS; TEMPORAL BINDING WINDOW; MULTISENSORY INTEGRATION; SELECTIVE ATTENTION; INDIVIDUAL-DIFFERENCES; REACTION-TIME; CORTEX; HUMANS; RESPONSES;
DOI
10.1016/j.neuropsychologia.2020.107530
Chinese Library Classification
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology];
Subject Classification Codes
03 ; 0303 ; 030303 ; 04 ; 0402 ;
Abstract
In order to parse the world around us, we must constantly determine which sensory inputs arise from the same physical source and should therefore be perceptually integrated. Temporal coherence between auditory and visual stimuli drives audio-visual (AV) integration, but the role played by AV spatial alignment is less well understood. Here, we manipulated AV spatial alignment and collected electroencephalography (EEG) data while human subjects performed a free-field variant of the "pip and pop" AV search task. In this paradigm, visual search is aided by a spatially uninformative auditory tone, the onsets of which are synchronized to changes in the visual target. In Experiment 1, tones were either spatially aligned or spatially misaligned with the visual display. Regardless of AV spatial alignment, we replicated the key pip and pop result of improved AV search times. Mirroring the behavioral results, we found an enhancement of early event-related potentials (ERPs), particularly the auditory N1 component, in both AV conditions. We demonstrate that both top-down and bottom-up attention contribute to these N1 enhancements. In Experiment 2, we tested whether spatial alignment influences AV integration in a more challenging context with competing multisensory stimuli. An AV foil was added that visually resembled the target and was synchronized to its own stream of synchronous tones. The visual components of the AV target and AV foil occurred in opposite hemifields; the two auditory components were also in opposite hemifields and were either spatially aligned or spatially misaligned with the visual components to which they were synchronized. Search was fastest when the auditory and visual components of the AV target (and the foil) were spatially aligned. Attention modulated ERPs in both spatial conditions, but importantly, the scalp topography of early evoked responses shifted only when stimulus components were spatially aligned, signaling the recruitment of different neural generators likely related to multisensory integration. These results suggest that AV integration depends on AV spatial alignment when stimuli in both modalities compete for selective integration, a common scenario in real-world perception.
Pages: 17