Electrophysiological correlates of facial configuration and audio-visual congruency: evidence that face processing is a visual rather than a multisensory task

Cited by: 3
Authors
Proctor, Barnaby J. [1]
Meyer, Georg F. [1]
Affiliations
[1] Univ Liverpool, Liverpool L69 3BX, Merseyside, England
Keywords
Face recognition; EEG; Multisensory; Congruence; Auditory; Visual; Integration; Recognition; Speech; Identification; Perception; Components; Fusiform; Humans; Cortex; Area
DOI
10.1007/s00221-011-2724-x
Chinese Library Classification (CLC) number
Q189 [Neuroscience]
Subject classification code
071006
Abstract
Behavioural, neuroimaging and lesion studies show that face processing has a special role in human perception. The purpose of this EEG study was to explore whether auditory information influences visual face perception. We employed a 2 x 2 factorial design and presented subjects with visual stimuli, either cartoon faces or scrambled faces, in which size changes of one component, the mouth in the face condition, were either congruent or incongruent with the amplitude modulation of a simultaneously presented auditory signal. Our data show a significant main effect of signal congruence at an ERP peak around 135 ms and a significant main effect of face configuration at around 200 ms. The timing and scalp topography of both effects correspond well to previously reported data on the integration of non-redundant audio-visual stimuli and on face-selective processing. Our analysis did not show any significant statistical interactions. This double dissociation suggests that the early component, at 135 ms, is sensitive to audio-visual congruency but not to facial configuration, and that the later component is sensitive to facial configuration but not to AV congruency. We conclude that facial configurational processing is not influenced by the congruence of simultaneous auditory signals and is independent of featural processing, where we do see evidence for multisensory integration.
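The abstract describes a 2 x 2 factorial analysis (audio-visual congruence x facial configuration) of ERP peak amplitudes, testing two main effects and their interaction. As a rough illustration only, and not the authors' analysis pipeline, the sketch below shows how such a design could be tested with a repeated-measures ANOVA in Python; the participant count, column names, and synthetic amplitudes are hypothetical placeholders.

# Minimal sketch, assuming peak ERP amplitudes have already been extracted
# per subject and condition. All data here are synthetic and hypothetical.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
rows = []
for subject in range(1, 13):                        # hypothetical 12 participants
    for congruence in ("congruent", "incongruent"):
        for configuration in ("face", "scrambled"):
            # hypothetical peak amplitude (microvolts) at a given latency
            amplitude = rng.normal(loc=2.0, scale=1.0)
            rows.append({"subject": subject,
                         "congruence": congruence,
                         "configuration": configuration,
                         "amplitude": amplitude})
data = pd.DataFrame(rows)

# 2 x 2 repeated-measures ANOVA: F and p values for each main effect
# (congruence, configuration) and for their interaction.
result = AnovaRM(data, depvar="amplitude", subject="subject",
                 within=["congruence", "configuration"]).fit()
print(result)

Under the pattern reported in the abstract, the early (about 135 ms) peak would show a significant main effect of congruence with no interaction, while the later (about 200 ms) peak would show a significant main effect of configuration with no interaction, which is the double dissociation the authors describe.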
Pages: 203-211
Page count: 9