Audio-visual facilitation of the mu rhythm

Times Cited: 25
Authors
McGarry, Lucy M. [1 ]
Russo, Frank A. [1 ]
Schalles, Matt D. [2 ]
Pineda, Jaime A. [2 ,3 ]
Affiliations
[1] Ryerson Univ, Dept Psychol, Toronto, ON M5B 2K3, Canada
[2] Univ Calif San Diego, Dept Cognit Sci, La Jolla, CA 92037 USA
[3] Univ Calif San Diego, Neurosci Grp, La Jolla, CA 92037 USA
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Mu rhythm; Mirror neuron system; Multimodal facilitation; Independent components analysis; GRASP REPRESENTATIONS; MOTOR FACILITATION; EEG; RECOGNITION; ACTIVATION; HUMANS; CORTEX; PERCEPTION; COMPONENT; PREMOTOR;
DOI
10.1007/s00221-012-3046-3
CLC Number
Q189 [Neuroscience];
Subject Classification Code
071006;
Abstract
Previous studies demonstrate that perception of action presented audio-visually facilitates greater mirror neuron system (MNS) activity in humans (Kaplan and Iacoboni in Cogn Process 8(2):103-113, 2007) and non-human primates (Keysers et al. in Exp Brain Res 153(4):628-636, 2003) than perception of action presented unimodally. In the current study, we examined whether audio-visual facilitation of the MNS can be indexed using electroencephalography (EEG) measurement of the mu rhythm. The mu rhythm is an EEG oscillation with peaks at 10 and 20 Hz that is suppressed during the execution and perception of action and is speculated to reflect activity in the premotor and inferior parietal cortices as a result of MNS activation (Pineda in Behav Brain Funct 4(1):47, 2008). Participants observed experimental stimuli unimodally (visual-alone or audio-alone) or bimodally during randomized presentations of two hands ripping a sheet of paper, and a control video depicting a box moving up and down. Audio-visual perception of action stimuli led to greater event-related desynchrony (ERD) of the 8-13 Hz mu rhythm compared to unimodal perception of the same stimuli over the C3 electrode, as well as in a left central cluster when data were examined in source space. These results are consistent with Kaplan and Iacoboni's (in Cogn Process 8(2):103-113, 2007) findings that indicate audio-visual facilitation of the MNS; our left central cluster was localized approximately 13.89 mm away from the ventral premotor cluster identified in their fMRI study, suggesting that these clusters originate from similar sources. Consistency of results in electrode space and component space supports the use of ICA as a valid source localization tool.
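The abstract's central measure, event-related desynchrony (ERD) of the 8-13 Hz mu band at a single electrode such as C3, can be sketched generically: band-pass the trial to the mu band, estimate instantaneous power, and express power in the observation window as a percentage change from a pre-stimulus baseline. The Python/SciPy sketch below is only an illustration of that computation, not the authors' analysis pipeline; the sampling rate, window boundaries, and synthetic trial are assumptions added for demonstration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert


def mu_erd_percent(epoch, fs, baseline_s, active_s, band=(8.0, 13.0)):
    """Classic ERD%: relative mu-band power change in an observation window
    vs. a pre-stimulus baseline (negative values indicate desynchrony).

    epoch      : 1-D array, single-channel EEG (e.g., C3), one trial
    fs         : sampling rate in Hz (assumed value for this sketch)
    baseline_s : (start, stop) of the baseline window, in seconds
    active_s   : (start, stop) of the observation window, in seconds
    """
    # Band-pass the trial to the mu band with a zero-phase IIR filter.
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    mu = filtfilt(b, a, epoch)

    # Instantaneous mu power from the analytic-signal envelope.
    power = np.abs(hilbert(mu)) ** 2

    def mean_power(window):
        i0, i1 = int(window[0] * fs), int(window[1] * fs)
        return power[i0:i1].mean()

    p_base, p_act = mean_power(baseline_s), mean_power(active_s)
    return 100.0 * (p_act - p_base) / p_base  # negative = mu suppression


if __name__ == "__main__":
    # Synthetic single trial (illustrative only): a 10 Hz mu oscillation whose
    # amplitude drops after "stimulus onset" at t = 1 s, plus broadband noise.
    fs, dur = 250, 3.0
    t = np.arange(0, dur, 1.0 / fs)
    amp = np.where(t < 1.0, 2.0, 1.0)  # suppression after onset
    trial = amp * np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)

    erd = mu_erd_percent(trial, fs, baseline_s=(0.0, 1.0), active_s=(1.5, 2.5))
    print(f"mu ERD: {erd:.1f}%")  # expect a clearly negative value
```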
Pages: 527-538
Number of pages: 12
Related Articles
50 records in total
  • [41] Mu rhythm desynchronization by tongue thrust observation
    Sakihara, Kotoe
    Inagaki, Masumi
    FRONTIERS IN HUMAN NEUROSCIENCE, 2015, 9
  • [42] EEG Mu Rhythm in Typical and Atypical Development
    Bernier, Raphael
    Aaronson, Benjamin
    Kresse, Anna
    JOVE-JOURNAL OF VISUALIZED EXPERIMENTS, 2014, (86)
  • [43] Audio-visual interactions uniquely contribute to resolution of visual conflict in people possessing absolute pitch
    Kim, Sujin
    Blake, Randolph
    Lee, Minyoung
    Kim, Chai-Youn
    PLOS ONE, 2017, 12 (04)
  • [44] The 'Audio-Visual Face Cover Corpus': Investigations into audio-visual speech and speaker recognition when the speaker's face is occluded by facewear
    Fecher, Natalie
    13TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2012 (INTERSPEECH 2012), VOLS 1-3, 2012, : 2247 - 2250
  • [45] Audio-visual and olfactory-visual integration in healthy participants and subjects with autism spectrum disorder
    Stickel, Susanne
    Weismann, Pauline
    Kellermann, Thilo
    Regenbogen, Christina
    Habel, Ute
    Freiherr, Jessica
    Chechko, Natalya
    HUMAN BRAIN MAPPING, 2019, 40 (15) : 4470 - 4486
  • [46] Visual field differences in temporal synchrony processing for audio-visual stimuli
    Takeshima, Yasuhiro
    PLOS ONE, 2021, 16 (12)
  • [47] Perceived Audio-Visual Simultaneity Is Recalibrated by the Visual Intensity of the Preceding Trial
    Horsfall, Ryan
    Harrison, Neil
    Meyer, Georg
    Wuerger, Sophie
    MULTISENSORY RESEARCH, 2024, 37 (02) : 143 - 162
  • [48] The neural basis of visual dominance in the context of audio-visual object processing
    Schmid, Carmen
    Buechel, Christian
    Rose, Michael
    NEUROIMAGE, 2011, 55 (01) : 304 - 311
  • [49] Visualized voices: A case study of audio-visual synesthesia
    Fernay, Louise
    Reby, David
    Ward, Jamie
    NEUROCASE, 2012, 18 (01) : 50 - 56
  • [50] IMPROVING ACOUSTIC MODELING USING AUDIO-VISUAL SPEECH
    Abdelaziz, Ahmed Hussen
    2017 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2017, : 1081 - 1086