Similar and distinct neural mechanisms of visual and auditory emotion perception

Cited: 0
Authors
Zhang, Yao [1 ,2 ]
Zhang, Fan [1 ,2 ]
Bi, Taiyong [1 ,2 ,3 ]
Qiu, Jiang [1 ,2 ]
Affiliations
[1] Southwest Univ, Fac Psychol, Chongqing 400715, Peoples R China
[2] Minist Educ, Key Lab Cognit & Personal, Chongqing 400715, Peoples R China
[3] Zunyi Med Univ, Sch Management, Zunyi 563000, Guizhou, Peoples R China
Source
CHINESE SCIENCE BULLETIN-CHINESE | 2019, Vol. 64, No. 07
Keywords
functional magnetic resonance imaging; visual modality; auditory modality; emotion perception; neural mechanism; multisensory cortex; SUPERIOR TEMPORAL SULCUS; HUMAN EXTRASTRIATE CORTEX; FUSIFORM FACE AREA; FACIAL EXPRESSIONS; INTEGRATION; RECOGNITION; PROSODY; ORGANIZATION; INFORMATION; IDENTITY;
DOI
10.1360/N972018-00721
CLC (Chinese Library Classification) numbers
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Science];
Subject classification codes
07 ; 0710 ; 09 ;
Abstract
Emotion is a rapidly changing psychological and physiological phenomenon. In daily life, people rely on information from multiple sensory modalities (visual, auditory, etc.) to perceive non-verbal emotional signals. Non-verbal emotional cues from faces and voices are often complex and variable. Previous studies have revealed common and distinct neural networks underlying the perception of human faces and voices. However, the neural mechanisms underlying visual and auditory emotion perception have not been well characterized. Furthermore, despite research on audiovisual integration of cross-modal emotional information, the multisensory cortical areas that process visual and auditory emotional information remain elusive. It is therefore necessary to examine the similarities and differences between the neural mechanisms of emotion perception in the visual and auditory modalities, and to identify the multisensory cortex involved in cross-modal emotion perception. The present functional magnetic resonance imaging (fMRI) study adopted a 2 × 3 event-related design (stimulus modality: visual, auditory; emotional valence: happy, sad, fearful) to investigate the neural mechanisms of emotion perception in the visual and auditory modalities. While an emotional face was presented visually or an emotional voice aurally, participants were required to make a gender judgment. The results showed that activation elicited by emotional faces in V1-V4, the bilateral fusiform gyrus, and the bilateral superior temporal sulcus (STS) was significantly stronger than that elicited by emotional voices. Conversely, activation elicited by emotional voices in the auditory cortex (AC) was significantly stronger than that elicited by emotional faces.
Multivoxel pattern analysis (MVPA) showed that activation patterns in the right STS (rSTS) could discriminate among faces of different emotional valence (happy, sad, and fearful), indicating that the rSTS plays an important role in perceiving faces with different emotional valence. Activation patterns in the right fusiform face area (rFFA) differed between happy and sad faces, indicating that the rFFA is crucial for the perception of positive versus negative emotional faces. A voxel-based whole-brain analysis was further performed to identify cortical areas modulated by emotional valence. This analysis showed a significant main effect of emotional valence in the opercular part of the left inferior frontal gyrus, indicating that this region might serve as a multisensory cortex for visual-auditory emotion perception. In summary, our study provides important evidence for further understanding emotion perception in different modalities and the multisensory cortex underlying cross-modal emotion perception.
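The MVPA decoding described above can be sketched as follows. This is a minimal illustration on synthetic "voxel pattern" data, not the study's actual pipeline: the abstract does not specify the classifier, so a cross-validated nearest-centroid decoder is used here as a simple stand-in, and the trial counts, voxel counts, and signal strengths are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic voxel patterns: 30 trials per condition, 50 voxels,
# each condition gets a small condition-specific signal added to noise.
n_trials, n_voxels = 30, 50
conditions = ["happy", "sad", "fearful"]
signal = {c: rng.normal(0, 1, n_voxels) for c in conditions}
X = np.vstack([signal[c] * 0.8 + rng.normal(0, 1, (n_trials, n_voxels))
               for c in conditions])
y = np.repeat(np.arange(len(conditions)), n_trials)

def nearest_centroid_cv(X, y, n_folds=5):
    """Cross-validated nearest-centroid decoding accuracy:
    train centroids on held-in trials, classify held-out trials."""
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, n_folds)
    correct = 0
    for test in folds:
        train = np.setdiff1d(idx, test)
        centroids = np.stack([X[train][y[train] == c].mean(axis=0)
                              for c in np.unique(y)])
        # Euclidean distance from each test pattern to each centroid
        d = np.linalg.norm(X[test][:, None, :] - centroids[None], axis=2)
        correct += (d.argmin(axis=1) == y[test]).sum()
    return correct / len(y)

acc = nearest_centroid_cv(X, y)
print(f"decoding accuracy: {acc:.2f} (chance = {1/3:.2f})")
```

Above-chance cross-validated accuracy is the MVPA evidence that a region's activation patterns carry condition information; in the study, this logic is applied to rSTS and rFFA patterns rather than synthetic data.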
Pages: 705-714
Page count: 10