Cross-Channel Adaptation Reveals Shared Emotion Representation From Face and Biological Motion

Cited: 2
Authors
Yuan, Tian
Wang, Li [1]
Jiang, Yi
Affiliation
[1] Chinese Acad Sci, Inst Psychol, State Key Lab Brain & Cognit Sci, 16 Lincui Rd, Beijing 100101, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
adaptation; emotion perception; biological motion; face; SUPERIOR TEMPORAL SULCUS; FACIAL-EXPRESSION; NEURAL REPRESENTATIONS; SELECTIVE RESPONSES; VISUAL-ADAPTATION; COLOR ADAPTATION; FMRI-ADAPTATION; INVISIBLE FACES; BRAIN-AREAS; HAPPY FACES;
DOI
10.1037/emo0001409
CLC Number
B84 [Psychology]
Discipline Classification Codes
04; 0402
Abstract
Emotions in interpersonal interactions can be communicated simultaneously via various social signals, such as faces and biological motion (BM). Here, we demonstrate that even though BM and faces differ greatly in their visual properties, emotions conveyed by these two types of social signals involve dedicated and common processing mechanisms (N = 168, college students, 2020-2024). Using the visual adaptation paradigm, we found that prolonged exposure to happy BM biased the emotion perception of subsequently presented morphed BM toward sad, and vice versa. The observed aftereffect disappeared when the BM adaptors were shown inverted, indicating that it arose from emotional information processing rather than from adaptation to constituent low-level features. Moreover, such an aftereffect was also found for facial expressions and similarly vanished when the face adaptors were inverted. Critically, preexposure to emotional faces also exerted an adaptation aftereffect on the emotion perception of BMs. Furthermore, this cross-channel effect occurred not only from faces to BMs but also from BMs to faces, suggesting that emotion perception from faces and BM is potentially driven by common underlying neural substrates. Overall, these findings highlight a close coupling of BM and face emotion perception and suggest the existence of a dedicated emotional representation that can be shared across these two different types of social signals.
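For readers unfamiliar with how such adaptation aftereffects are typically quantified, the sketch below illustrates the standard analysis implied by this paradigm: fit a psychometric function to the proportion of "happy" judgments across sad-to-happy morph levels, then measure how adaptation shifts the point of subjective equality (PSE). The data values, the logistic form, and all parameters are illustrative assumptions, not the authors' materials or analysis code.

```python
# Minimal sketch (hypothetical data): an adaptation aftereffect quantified
# as a shift in the point of subjective equality (PSE) of a psychometric
# function fitted to "happy" judgments across morph levels.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    # P("happy" response) as a function of morph level x
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

morph = np.linspace(0.0, 1.0, 7)  # 0 = fully sad, 1 = fully happy

# Hypothetical response proportions: baseline vs. after a happy adaptor
baseline = np.array([0.02, 0.08, 0.25, 0.50, 0.75, 0.92, 0.98])
adapted  = np.array([0.01, 0.03, 0.10, 0.28, 0.55, 0.80, 0.95])

popt_base, _ = curve_fit(logistic, morph, baseline, p0=[0.5, 10.0])
popt_adapt, _ = curve_fit(logistic, morph, adapted, p0=[0.5, 10.0])
pse_shift = popt_adapt[0] - popt_base[0]

print(f"Baseline PSE: {popt_base[0]:.3f}")
print(f"Post-adaptation PSE: {popt_adapt[0]:.3f}")
# A positive shift means more physical "happiness" is needed before the
# morph is judged happy, i.e., perception is biased toward sad after
# adapting to a happy stimulus.
print(f"Aftereffect (PSE shift): {pse_shift:+.3f}")
```

In the cross-channel conditions described above, the same logic would apply with the adaptor drawn from one channel (e.g., faces) and the morphed test stimuli from the other (BM), or the reverse.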
Pages: 158-173
Page count: 16
Related Papers
14 records in total
  • [1] Evidence for a supra-modal representation of emotion from cross-modal adaptation
    Pye, Annie
    Bestelmeyer, Patricia E. G.
    COGNITION, 2015, 134: 245-251
  • [2] Hysteresis reveals a happiness bias effect in dynamic emotion recognition from ambiguous biological motion
    Cortes, Ana Borges
    Duarte, Joao Valente
    Castelo-Branco, Miguel
    JOURNAL OF VISION, 2023, 23(13): 5
  • [3] The neural representation of body orientation and emotion from biological motion
    Liu, Shuaicheng
    Yu, Lu
    Ren, Jie
    Zhang, Mingming
    Luo, Wenbo
    NEUROIMAGE, 2025, 310
  • [4] Social categories shape the neural representation of emotion: evidence from a visual face adaptation task
    Otten, Marte
    Banaji, Mahzarin R.
    FRONTIERS IN INTEGRATIVE NEUROSCIENCE, 2012, 6
  • [5] Prosody Dominates Over Semantics in Emotion Word Processing: Evidence From Cross-Channel and Cross-Modal Stroop Effects
    Lin, Yi
    Ding, Hongwei
    Zhang, Yang
    JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH, 2020, 63(3): 896-912
  • [6] fMR-adaptation reveals invariant coding of biological motion on the human STS
    Grossman, Emily D.
    Jardine, Nicole L.
    Pyles, John A.
    FRONTIERS IN HUMAN NEUROSCIENCE, 2010, 4
  • [7] Adaptation aftereffects in the perception of gender from biological motion
    Troje, Nikolaus F.
    Sadr, Javid
    Geyer, Henning
    Nakayama, Ken
    JOURNAL OF VISION, 2006, 6(8): 850-857
  • [8] Local velocity representation: evidence from motion adaptation
    Schrater, P. R.
    Simoncelli, E. P.
    VISION RESEARCH, 1998, 38(24): 3899-3912
  • [9] Evaluating emotion from biological motion: The role of social competence
    daSilva, Elizabeth B.
    Jameel, Bushra
    Jaime, Mark
    AFFECTIVE SCIENCE, 2022, 3(1)
  • [10] A motion capture library for the study of identity, gender, and emotion perception from biological motion
    Ma, Yingliang
    Paterson, Helena M.
    Pollick, Frank E.
    BEHAVIOR RESEARCH METHODS, 2006, 38: 134-141