Cortical operational synchrony during audio-visual speech integration

Cited by: 47
Authors
Fingelkurts, AA
Fingelkurts, AA
Krause, CM
Möttönen, R
Sams, M
Affiliations
[1] Moscow MV Lomonosov State Univ, Human Physiol Dept, Human Brain Res Grp, Moscow 119899, Russia
[2] BM Sci Brain & Mind Technol Res Ctr, FI-02601 Espoo, Finland
[3] Univ Helsinki, Cognit Sci Dept Psychol, FIN-00014 Helsinki, Finland
[4] Aalto Univ, Lab Computat Engn, Helsinki 02015, Finland
Keywords
multisensory integration; crossmodal; audio-visual; synchronization; operations; large-scale networks; MEG;
DOI
10.1016/S0093-934X(03)00059-2
CLC Number
R36 [Pathology]; R76 [Otorhinolaryngology];
Subject Classification Codes
100104 ; 100213 ;
Abstract
Information from different sensory modalities is processed in different cortical regions. However, our daily perception is based on the overall impression resulting from the integration of information from multiple sensory modalities. At present it is not known how the human brain integrates information from different modalities into a unified percept. Using the robust McGurk effect, the present study showed that audio-visual synthesis takes place within distributed and dynamic cortical networks with emergent properties. Various cortical sites within these networks interact with each other by means of so-called operational synchrony (Kaplan, Fingelkurts, Fingelkurts, & Darkhovsky, 1997). The temporal synchronization of cortical operations processing unimodal stimuli at different cortical sites reveals the importance of the temporal features of auditory and visual stimuli for audio-visual speech integration. (C) 2003 Elsevier Science (USA). All rights reserved.
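The operational synchrony measure referenced in the abstract is defined in Kaplan, Fingelkurts, Fingelkurts, and Darkhovsky (1997); the underlying intuition is that each channel is segmented into quasi-stationary episodes and synchrony is the above-chance coincidence of segment boundaries across channels. The Python sketch below is only a minimal illustration of that intuition, not the authors' algorithm: the variance-jump change-point detector, the window and tolerance parameters, and the shuffled baseline are simplifying assumptions introduced here.

```python
import numpy as np

def change_points(x, win=20, thresh=3.0):
    """Crude change-point detector: flag samples where the variance of the
    following window differs sharply from that of the preceding window
    (a stand-in for rapid-transition detection, not the published method)."""
    cps = []
    for i in range(win, len(x) - win):
        v_pre = x[i - win:i].var()
        v_post = x[i:i + win].var()
        ratio = max(v_pre, v_post) / (min(v_pre, v_post) + 1e-12)
        if ratio > thresh and (not cps or i - cps[-1] >= win):
            cps.append(i)
    return np.array(cps)

def synchrony_index(cp_a, cp_b, n_samples, tol=10, n_shuffle=200, seed=0):
    """Observed boundary coincidences (within +/- tol samples) between two
    channels, divided by the mean coincidence count for randomly placed
    boundaries with the same counts. Values well above 1 suggest coupling."""
    rng = np.random.default_rng(seed)
    def coincidences(a, b):
        return sum(bool(np.any(np.abs(b - t) <= tol)) for t in a)
    observed = coincidences(cp_a, cp_b)
    baseline = np.mean([
        coincidences(rng.integers(0, n_samples, size=len(cp_a)),
                     rng.integers(0, n_samples, size=len(cp_b)))
        for _ in range(n_shuffle)
    ])
    return observed / max(baseline, 1e-12)

# Toy data: two noisy channels sharing the same abrupt amplitude changes.
n = 2000
rng = np.random.default_rng(1)
shared = np.repeat([0.0, 4.0, 0.0, 4.0, 0.0], n // 5)  # steps at 400, 800, ...
ch1 = shared + rng.normal(size=n)
ch2 = shared + rng.normal(size=n)

idx = synchrony_index(change_points(ch1), change_points(ch2), n_samples=n)
print(f"synchrony index (observed / chance coincidences): {idx:.2f}")
```

On this toy data the shared amplitude steps produce boundaries that coincide across the two channels far more often than in the shuffled baseline, so the index comes out well above 1; independent channels would hover around 1.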
Pages: 297-312
Page count: 16
Related Papers
50 records in total
  • [1] Cortical integration of audio-visual speech and non-speech stimuli
Vander Wyk, Brent C.
    Ramsay, Gordon J.
    Hudac, Caitlin M.
    Jones, Warren
    Lin, David
    Klin, Ami
    Lee, Su Mei
    Pelphrey, Kevin A.
    BRAIN AND COGNITION, 2010, 74 (02) : 97 - 106
  • [2] Infant Perception of Audio-Visual Speech Synchrony
    Lewkowicz, David J.
    DEVELOPMENTAL PSYCHOLOGY, 2010, 46 (01) : 66 - 77
  • [3] Synchrony of audio-visual speech stimuli modulates left superior temporal sulcus
    Balk, Marja H.
    Ojanen, Ville
    Pekkola, Johanna
    Autti, Taina
    Sams, Mikko
Jääskeläinen, Iiro P.
    NEUROREPORT, 2010, 21 (12) : 822 - 826
  • [4] Atypical audio-visual neural synchrony and speech processing in early autism
    Wang, Xiaoyue
    Bouton, Sophie
    Kojovic, Nada
    Giraud, Anne-Lise
    Schaer, Marie
    JOURNAL OF NEURODEVELOPMENTAL DISORDERS, 2025, 17 (01)
  • [5] Retinotopic effects during spatial audio-visual integration
    Meienbrock, A.
    Naumer, M. J.
    Doehrmann, O.
    Singer, W.
    Muckli, L.
    NEUROPSYCHOLOGIA, 2007, 45 (03) : 531 - 539
  • [6] Statistical multimodal integration for audio-visual speech processing
    Nakamura, S
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2002, 13 (04) : 854 - 866
  • [7] On the 'visual' in 'audio-visual integration': a hypothesis concerning visual pathways
    Jaekl, Philip
    Perez-Bellido, Alexis
    Soto-Faraco, Salvador
    EXPERIMENTAL BRAIN RESEARCH, 2014, 232 (06) : 1631 - 1638
  • [8] Effects of audio-visual integration on the detection of masked speech and non-speech sounds
    Eramudugolla, Ranmalee
    Henderson, Rachel
    Mattingley, Jason B.
    BRAIN AND COGNITION, 2011, 75 (01) : 60 - 66
  • [9] Audio-visual integration during overt visual attention
    Quigley, Cliodhna
    Onat, Selim
    Harding, Sue
    Cooke, Martin
    Koenig, Peter
JOURNAL OF EYE MOVEMENT RESEARCH, 2007, 1 (02)
  • [10] Visual limitations shape audio-visual integration
    Perez-Bellido, Alexis
    Ernst, Marc O.
    Soto-Faraco, Salvador
    Lopez-Moliner, Joan
JOURNAL OF VISION, 2015, 15 (14)