Modality-specific brain representations during automatic processing of face, voice and body expressions

Cited by: 2
Authors
Vaessen, Maarten [1 ]
van der Heijden, Kiki [2 ]
de Gelder, Beatrice [3 ]
Affiliations
[1] Zuyd Univ Appl Sci, Maastricht, Netherlands
[2] Radboud Univ Nijmegen, Nijmegen, Netherlands
[3] Maastricht Univ, Maastricht, Netherlands
Funding
European Research Council; EU Horizon 2020;
Keywords
multisensory affect; faces; voices; bodies; emotion perception; facial expressions; voice; EMOTIONAL EXPRESSIONS; PATTERN-ANALYSIS; PERCEPTION; COMMON; INTEGRATION; RESPONSES; CORTEX; SCENES;
DOI
10.3389/fnins.2023.1132088
Chinese Library Classification
Q189 [Neuroscience];
Discipline classification code
071006;
Abstract
A central question in affective science, and one that is relevant for its clinical applications, is how emotions conveyed by different stimuli are experienced and represented in the brain. According to the traditional view, emotional signals are recognized with the help of emotion concepts that are typically used in descriptions of mental states and emotional experiences, irrespective of the sensory modality. This perspective motivated the search for abstract representations of emotions in the brain, shared across variations in stimulus type (face, body, voice) and sensory origin (visual, auditory). On the other hand, emotion signals such as an aggressive gesture trigger rapid automatic behavioral responses, and this may take place before, or independently of, a full abstract representation of the emotion. This pleads in favor of specific emotion signals that may trigger rapid adaptive behavior by mobilizing only modality- and stimulus-specific brain representations, without relying on higher-order abstract emotion categories. To test this hypothesis, we presented participants with naturalistic dynamic emotion expressions of the face, the whole body, or the voice in a functional magnetic resonance imaging (fMRI) study. To focus on automatic emotion processing and sidestep explicit concept-based emotion recognition, participants performed an unrelated target detection task presented in a different sensory modality than the stimulus. Using multivariate analyses to assess neural activity patterns in response to the different stimulus types, we reveal a stimulus-category- and modality-specific brain organization of affective signals. Our findings are consistent with the notion that, under ecological conditions, emotion expressions of the face, body, and voice may have different functional roles in triggering rapid adaptive behavior, even if, when viewed from an abstract conceptual vantage point, they all exemplify the same emotion.
This has implications for a neuroethologically grounded emotion research program that should start from detailed behavioral observations of how face, body, and voice expressions function in naturalistic contexts.
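The multivariate analyses mentioned in the abstract can be illustrated with a minimal, purely synthetic sketch. This is not the authors' pipeline: the data are simulated, and the decoder is a simple leave-one-run-out nearest-centroid classifier over voxel patterns, a common baseline form of multivariate pattern analysis for testing whether stimulus categories (face, body, voice) evoke distinguishable activity patterns.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 stimulus categories, 6 fMRI runs, 50 voxels.
# Each category is given a distinct underlying "prototype" pattern.
categories = ["face", "body", "voice"]
n_runs, n_voxels = 6, 50
prototypes = {c: rng.normal(0.0, 1.0, n_voxels) for c in categories}

# Simulate one noisy activity pattern per category per run.
patterns = {(c, r): prototypes[c] + rng.normal(0.0, 0.5, n_voxels)
            for c in categories for r in range(n_runs)}

def classify(test_pattern, centroids):
    """Assign the category whose mean pattern correlates most with the test pattern."""
    corrs = {c: np.corrcoef(test_pattern, m)[0, 1] for c, m in centroids.items()}
    return max(corrs, key=corrs.get)

# Leave-one-run-out cross-validation: train centroids on all runs but one,
# then decode the held-out run's patterns.
correct, total = 0, 0
for test_run in range(n_runs):
    centroids = {c: np.mean([patterns[(c, r)] for r in range(n_runs)
                             if r != test_run], axis=0)
                 for c in categories}
    for c in categories:
        correct += int(classify(patterns[(c, test_run)], centroids) == c)
        total += 1

accuracy = correct / total
print(f"decoding accuracy: {accuracy:.2f}")  # well above the 1/3 chance level here
```

Above-chance cross-validated accuracy is the standard evidence that the activity patterns carry category information; in the actual study, separate decoders per modality and category would probe whether that information is modality-specific rather than abstract.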
Pages: 13