A Deep-Learning Model for Subject-Independent Human Emotion Recognition Using Electrodermal Activity Sensors

Cited by: 76
Authors
Al Machot, Fadi [1 ]
Elmachot, Ali [2 ]
Ali, Mouhannad [3 ]
Al Machot, Elyan [4 ]
Kyamakya, Kyandoghere [3 ]
Affiliations
[1] Res Ctr Borstel, Leibniz Lung Ctr, D-23845 Borstel, Germany
[2] Univ Damascus, Fac Mech & Elect Engn, Damascus, Syria
[3] Alpen Adria Univ, Inst Smart Syst Technol, A-9020 Klagenfurt, Austria
[4] Tech Univ Dresden, Carl Gustav Carus Fac Med, D-01069 Dresden, Germany
Keywords
subject-dependent emotion recognition; subject-independent emotion recognition; electrodermal activity (EDA); deep learning; convolutional neural networks; ACCURACY
DOI
10.3390/s19071659
Chinese Library Classification (CLC)
O65 [Analytical Chemistry]
Discipline Classification Codes
070302; 081704
Abstract
One of the main objectives of Active and Assisted Living (AAL) environments is to ensure that elderly and/or disabled people live well in their immediate surroundings; this can be monitored, among other means, by recognizing emotions from minimally intrusive sensors such as Electrodermal Activity (EDA) sensors. However, building a machine-learning model that recognizes human emotions when it is trained on one group of persons and tested on an entirely new group remains a serious challenge, because the test group may exhibit different emotion patterns. Accordingly, this paper contributes to the field of human emotion recognition by proposing a Convolutional Neural Network (CNN) architecture that delivers promising robustness for both subject-dependent and subject-independent emotion recognition. The CNN is trained with a grid search, a hyperparameter optimization technique used to fine-tune the parameters of the proposed architecture. The overall concept is validated and stress-tested on the MAHNOB and DEAP datasets. The results demonstrate a promising robustness improvement across several evaluation metrics: accuracy for subject-independent classification reaches 78% on MAHNOB and 82% on DEAP, and for subject-dependent classification 81% on MAHNOB and 85% on DEAP (4 classes/labels). The work shows clearly that a robust classification of human emotion is possible using only non-intrusive EDA sensors, without involving additional physiological signals.
Pages: 14
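The abstract above describes the method only at a high level: a compact CNN over EDA signals whose hyperparameters are tuned with an exhaustive grid search. Since this record does not reproduce the published architecture, the following PyTorch sketch merely illustrates that general recipe on fixed-length EDA windows; the layer sizes, the parameter grid, and the train_fn/eval_fn hooks are illustrative assumptions, not the authors' model.

```python
# Minimal sketch (not the authors' exact architecture): a small 1D CNN for
# fixed-length EDA windows plus a brute-force grid search over hyperparameters.
import itertools
import torch.nn as nn


class EDAConvNet(nn.Module):
    """Small 1D CNN mapping an EDA window to one of 4 emotion classes."""

    def __init__(self, n_filters: int, kernel_size: int,
                 dropout: float, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, n_filters, kernel_size, padding=kernel_size // 2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(n_filters, 2 * n_filters, kernel_size,
                      padding=kernel_size // 2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(dropout),
            nn.Linear(2 * n_filters, n_classes),
        )

    def forward(self, x):
        # x: (batch, 1, window_len) raw or preprocessed EDA samples
        return self.classifier(self.features(x))


def grid_search(train_fn, eval_fn, grid):
    """Train and evaluate every hyperparameter combination; keep the best."""
    best_score, best_params = -1.0, None
    keys = sorted(grid)
    for values in itertools.product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        model = train_fn(params)      # user-supplied training routine
        score = eval_fn(model)        # e.g. validation accuracy
        if score > best_score:
            best_score, best_params = score, params
    return best_params, best_score


# Hypothetical grid; the paper does not publish these exact ranges here.
param_grid = {"n_filters": [16, 32], "kernel_size": [3, 5],
              "dropout": [0.25, 0.5]}
```

For subject-independent evaluation, eval_fn would typically compute accuracy on held-out subjects (e.g. leave-one-subject-out), so that the grid search selects hyperparameters that generalize across persons rather than fitting individual emotion patterns.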