Identifying Core Affect in Individuals from fMRI Responses to Dynamic Naturalistic Audiovisual Stimuli

Cited by: 27
Authors
Kim, Jongwan [1]
Wang, Jing [2]
Wedell, Douglas H. [1]
Shinkareva, Svetlana V. [1]
Affiliations
[1] Univ South Carolina, Dept Psychol, Columbia, SC 29208 USA
[2] Carnegie Mellon Univ, Dept Psychol, Pittsburgh, PA 15213 USA
Source
PLOS ONE | 2016, Vol. 11, Issue 9
Keywords
MULTISENSORY INTEGRATION; NEURAL REPRESENTATIONS; PATTERN-CLASSIFICATION; FACIAL EXPRESSIONS; EMOTIONAL AROUSAL; MUSICAL EMOTIONS; VALENCE; ACTIVATION; BRAIN; PERCEPTION;
DOI
10.1371/journal.pone.0161589
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline codes
07; 0710; 09;
Abstract
Recent research has demonstrated that affective states elicited by viewing pictures varying in valence and arousal are identifiable from whole brain activation patterns observed with functional magnetic resonance imaging (fMRI). Identification of affective states from more naturalistic stimuli has clinical relevance, but the feasibility of identifying these states on an individual trial basis from fMRI data elicited by dynamic multimodal stimuli is unclear. The goal of this study was to determine whether affective states can be similarly identified when participants view dynamic naturalistic audiovisual stimuli. Eleven participants viewed 5 s audiovisual clips in a passive viewing task in the scanner. Valence and arousal for individual trials were identified both within and across participants based on distributed patterns of activity in areas selectively responsive to audiovisual naturalistic stimuli while controlling for lower level features of the stimuli. In addition, the brain regions identified by searchlight analyses to represent valence and arousal were consistent with previously identified regions associated with emotion processing. These findings extend previous results on the distributed representation of affect to multimodal dynamic stimuli.
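The abstract describes identifying valence and arousal on individual trials from distributed fMRI activity patterns. The following sketch illustrates the general logic of such trial-level pattern classification with a minimal leave-one-trial-out, correlation-based nearest-class-mean classifier on simulated voxel data. This is not the authors' analysis pipeline; the voxel counts, noise levels, and classifier choice are illustrative assumptions.

```python
import random

random.seed(0)

N_VOXELS = 50   # hypothetical number of voxels in the region of interest
N_TRIALS = 20   # hypothetical trials per valence category

# Simulated prototype activation patterns for each valence category.
proto = {lab: [random.gauss(0, 1) for _ in range(N_VOXELS)]
         for lab in ("positive", "negative")}

def noisy(pattern, sd=0.8):
    """Simulate a single-trial pattern as prototype plus Gaussian noise."""
    return [v + random.gauss(0, sd) for v in pattern]

trials = [(lab, noisy(proto[lab])) for lab in proto for _ in range(N_TRIALS)]

def corr(a, b):
    """Pearson correlation between two equal-length voxel patterns."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def classify(test_idx):
    """Leave one trial out; assign it to the class whose mean pattern
    (computed from the remaining trials) it correlates with most."""
    test_lab, test_pat = trials[test_idx]
    means = {}
    for lab in proto:
        pats = [p for i, (l, p) in enumerate(trials)
                if l == lab and i != test_idx]
        means[lab] = [sum(col) / len(col) for col in zip(*pats)]
    pred = max(means, key=lambda lab: corr(test_pat, means[lab]))
    return pred == test_lab

accuracy = sum(classify(i) for i in range(len(trials))) / len(trials)
```

With this synthetic signal-to-noise ratio, accuracy lands well above the 50% chance level, which is the same kind of evidence (above-chance single-trial identification) the study reports for real fMRI patterns.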
Pages: 21
Cited references: 85