Emotional synchrony and covariation of behavioral/physiological reactions between interlocutors

Cited by: 0
Authors
Arimoto, Yoshiko [1 ,2 ,3 ]
Okanoya, Kazuo [1 ,2 ,3 ]
Affiliations
[1] Univ Tokyo, Grad Sch Arts & Sci, Shibuya Ku, Tokyo 1538902, Japan
[2] ERATO, JST, Okanoya Emot Informat Project, Wako, Saitama 3510198, Japan
[3] RIKEN, Brain Sci Inst, Wako, Saitama 3510198, Japan
Source
2014 17TH ORIENTAL CHAPTER OF THE INTERNATIONAL COMMITTEE FOR THE CO-ORDINATION AND STANDARDIZATION OF SPEECH DATABASES AND ASSESSMENT TECHNIQUES (COCOSDA) | 2014
Keywords: (none listed)
DOI: (not available)
Chinese Library Classification (CLC): TP18 [Artificial intelligence theory];
Discipline codes: 081104; 0812; 0835; 1405;
Abstract
Covariation of behavioral/physiological reactions may give rise to emotional synchrony between interlocutors. Based on this assumption, this paper investigated (1) how interlocutors' emotions become synchronized during a dialog and (2) which types of behavioral or physiological reactions correlate with each other. Speakers' verbal and nonverbal emotional behaviors (vocal/facial) and physiological responses (heart rate/skin conductance) were recorded while they engaged in competitive/cooperative tasks. After the recording, the speakers annotated their own and their interlocutor's emotional states (arousal/valence/positivity). An analysis of variance on the correlation coefficients between emotional states suggested that male speakers were less emotionally synchronized than female speakers in the competitive dialog, and that speakers believed their emotions had been more synchronized than they actually were. Moreover, the correlation tests revealed that the behavioral or physiological reactions of most pairs in the same dialog were positively correlated.
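The core analysis the abstract describes — correlating the two interlocutors' reaction series within each pair — can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the function name `pearson_r` and the per-utterance arousal ratings are invented for the example, and the paper's real analysis additionally involves an ANOVA over these coefficients.

```python
# Hedged sketch of the pairwise-correlation step suggested by the
# abstract: given two interlocutors' aligned per-utterance arousal
# ratings (or physiological samples such as skin conductance), compute
# the Pearson correlation coefficient for the pair. Data and names are
# illustrative assumptions, not taken from the paper.
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between two equal-length series."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc = x - x.mean()  # center each series on its mean
    yc = y - y.mean()
    # covariance divided by the product of standard deviations
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Illustrative arousal annotations for one speaker pair (1-5 scale).
speaker_a = [2, 3, 4, 4, 5, 3, 2]
speaker_b = [1, 3, 3, 4, 4, 3, 2]
r = pearson_r(speaker_a, speaker_b)
print(f"pairwise arousal correlation: r = {r:.2f}")
```

A positive `r` for most pairs in the same dialog would correspond to the positively correlated reactions the abstract reports.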
Pages: 6