Towards Real-Time Multimodal Emotion Recognition among Couples

Cited by: 6
Authors
Boateng, George [1 ]
Affiliations
[1] Swiss Fed Inst Technol, Zurich, Switzerland
Source
PROCEEDINGS OF THE 2020 INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, ICMI 2020 | 2020
Funding
Swiss National Science Foundation
Keywords
Emotion Recognition; Multimodal Fusion; Couples; Smartwatches; Machine Learning; Deep Learning; Transfer Learning; NATURALISTIC OBSERVATION; SOCIAL SUPPORT; BEHAVIOR; SPEECH; HEALTH; MODEL;
DOI
10.1145/3382507.3421154
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Researchers are interested in understanding the emotions of couples as they relate to relationship quality and the dyadic management of chronic diseases. Currently, the process of assessing emotions is manual, time-intensive, and costly. Despite existing work on emotion recognition among couples, there is no ubiquitous system that recognizes the emotions of couples in everyday life while addressing the complexity of dyadic interactions, such as turn-taking in couples' conversations. In this work, we seek to develop a smartwatch-based system that leverages multimodal sensor data to recognize each partner's emotions in daily life. We are collecting data from couples in the lab and in the field, and we plan to use the data to develop multimodal machine learning models for emotion recognition. We then plan to implement the best models in a smartwatch app and evaluate the app's performance in real time and in everyday life through another field study. Such a system could enable research both in the lab (e.g., couple therapy) and in daily life (e.g., assessment of chronic disease management or relationship quality), and enable interventions to improve the emotional well-being, relationship quality, and chronic disease management of couples.
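The abstract does not specify how the planned multimodal models combine modalities. As a purely hypothetical illustration of one common approach, decision-level (late) fusion trains one classifier per modality and averages their predicted class probabilities. Everything below — the feature names, dimensions, and data — is a synthetic assumption for the sketch, not from the paper:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical late-fusion sketch. Synthetic stand-ins for two modalities:
# "acoustic" features (e.g., pitch/energy statistics from speech) and
# "physiological" features (e.g., heart-rate statistics from a smartwatch).
rng = np.random.default_rng(0)
n = 200
labels = rng.integers(0, 2, n)  # 0 = negative affect, 1 = positive affect
audio = rng.normal(size=(n, 8)) + labels[:, None] * 0.8   # class-shifted noise
physio = rng.normal(size=(n, 4)) + labels[:, None] * 0.5  # weaker signal

# One classifier per modality.
clf_audio = LogisticRegression().fit(audio, labels)
clf_physio = LogisticRegression().fit(physio, labels)

# Late fusion: average the per-modality class probabilities, then argmax.
proba = (clf_audio.predict_proba(audio) + clf_physio.predict_proba(physio)) / 2
fused_pred = proba.argmax(axis=1)
accuracy = float((fused_pred == labels).mean())
```

Late fusion is only one option; feature-level (early) fusion, which concatenates per-modality features before a single classifier, is the usual alternative and is simpler when the modalities are sampled at the same rate.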
Pages: 748-753 (6 pages)