Automated Affective Computing Based on Bio-Signals Analysis and Deep Learning Approach

Cited: 20
Authors
Filippini, Chiara [1 ]
Di Crosta, Adolfo [2 ]
Palumbo, Rocco [2 ]
Perpetuini, David [1 ]
Cardone, Daniela [1 ]
Ceccato, Irene [1 ]
Di Domenico, Alberto [2 ]
Merla, Arcangelo [1 ]
Institutions
[1] Univ G dAnnunzio, Dept Neurosci Imaging & Clin Sci, I-66100 Chieti, Italy
[2] Univ G dAnnunzio, Dept Psychol Hlth & Terr Sci, I-66100 Chieti, Italy
Keywords
affective computing; emotion recognition; infrared imaging; thermal imaging; HEART-RATE-VARIABILITY; CIRCUMPLEX MODEL; EMOTION RECOGNITION; WORKING-MEMORY; BASIC EMOTIONS; VALENCE; AROUSAL; MACHINE; ADAPTATION; CAMERA;
DOI
10.3390/s22051789
Chinese Library Classification
O65 [Analytical Chemistry];
Subject Classification Codes
070302 ; 081704 ;
Abstract
The extensive range of possible applications has made emotion recognition both indispensable and challenging in computer science, human-machine interaction, and affective computing, fields that increasingly demand real-time applications and interactions in everyday-life scenarios. However, while highly desirable, an accurate and automated emotion classification approach remains a challenging issue. To this end, this study presents an automated emotion recognition model based on easily accessible physiological signals and deep learning (DL) approaches. A Feedforward Neural Network was employed as the DL algorithm, and its outcome was compared with canonical machine learning algorithms such as random forest (RF). The developed DL model relied on the combined use of wearables and contactless technologies, such as thermal infrared imaging. The model classifies the emotional state into four classes derived from the linear combination of valence and arousal (referring to the four-quadrant structure of the circumplex model of affect), reaching an overall accuracy of 70% and outperforming the 66% accuracy of the RF model. Given the ecological and agile nature of the techniques used, the proposed model could enable innovative applications in the affective computing field.
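The abstract describes classifying emotion into the four quadrants of the circumplex model (valence × arousal) with a feedforward network. The sketch below is purely illustrative of that idea, not the authors' code: the quadrant labels, thresholds, layer sizes, and random weights are all assumptions.

```python
import numpy as np

# Illustrative sketch only: quadrant names, thresholds, and network
# dimensions are assumptions, not taken from the paper.

QUADRANTS = {
    (True, True): "high-arousal / positive-valence",
    (True, False): "high-arousal / negative-valence",
    (False, True): "low-arousal / positive-valence",
    (False, False): "low-arousal / negative-valence",
}

def quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair to its circumplex quadrant."""
    return QUADRANTS[(arousal > 0, valence > 0)]

def feedforward(x, weights, biases):
    """Minimal feedforward pass: ReLU hidden layer(s), softmax output."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(0.0, x @ W + b)        # ReLU hidden layer
    logits = x @ weights[-1] + biases[-1]
    e = np.exp(logits - logits.max())         # numerically stable softmax
    return e / e.sum()

# Toy usage: 8 physiological features -> 16 hidden units -> 4 quadrant classes
rng = np.random.default_rng(0)
weights = [rng.normal(size=(8, 16)), rng.normal(size=(16, 4))]
biases = [np.zeros(16), np.zeros(4)]
probs = feedforward(rng.normal(size=8), weights, biases)  # 4 class probabilities
```

In a trained model, the argmax of `probs` would give the predicted quadrant; here the untrained weights only show the shape of the computation.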
Pages: 19
Related Papers
50 records
  • [1] EMOTIONAL STRESS RECOGNITION SYSTEM FOR AFFECTIVE COMPUTING BASED ON BIO-SIGNALS
    Hosseini, Seyyed Abed
    Khalilzadeh, Mohammad Ali
    Changiz, Sahar
    JOURNAL OF BIOLOGICAL SYSTEMS, 2010, 18 : 101 - 114
  • [2] Deep learning-based classification of multichannel bio-signals using directedness transfer learning
    Bahador, Nooshin
    Kortelainen, Jukka
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2022, 72
  • [4] Prediction of Health Problems Using Deep Learning Images and Bio-Signals
    Lee, Min-Hye
    Mun, Hyung-Jin
    Kang, Sun-Kyoung
    APPLIED SCIENCES-BASEL, 2022, 12 (23):
  • [5] Evaluation of quantitative glare technique based on the analysis of bio-signals
    Lee, Ho-Sang
    Kim, Jung-Yong
    Subramaniyam, Murali
    Park, Sangho
    Min, Seung-Nam
    ERGONOMICS, 2017, 60 (10) : 1376 - 1383
  • [6] Detecting Concentration Condition by Analysis System of Bio-signals for Effective Learning
    Yajima, Kuniaki
    Takeichi, Yoshihiro
    Sato, Jun
    INFORMATION AND COMMUNICATION TECHNOLOGY (ICICT 2016), 2018, 625 : 81 - 89
  • [7] Deep learning based affective computing
    Kumar, Saurabh
    JOURNAL OF ENTERPRISE INFORMATION MANAGEMENT, 2021, 34 (05) : 1551 - 1575
  • [8] Control of a tilt table based on bio-signals
    Cho, J
    Seo, JY
    MODELLING AND CONTROL IN BIOMEDICAL SYSTEMS 2003 (INCLUDING BIOLOGICAL SYSTEMS), 2003, : 251 - 253
  • [9] Data Augmentation Methods for Machine-learning-based Classification of Bio-signals
    Sakai, Asuka
    Minoda, Yuki
    Morikawa, Koji
    2017 10TH BIOMEDICAL ENGINEERING INTERNATIONAL CONFERENCE (BMEICON), 2017,
  • [10] Analysis of Bio-Signals for Drivers' Stress Level Detection
    Yaman, Betul Nurefsan
    Isikli Esener, Idil
    2019 MEDICAL TECHNOLOGIES CONGRESS (TIPTEKNO), 2019, : 405 - 408