Motion Reveal Emotions: Identifying Emotions From Human Walk Using Chest Mounted Smartphone

Cited by: 27
Authors
Hashmi, Muhammad Arslan [1 ]
Riaz, Qaiser [1 ]
Zeeshan, Muhammad [1 ]
Shahzad, Muhammad [1 ]
Fraz, Muhammad Moazam [1 ]
Affiliations
[1] Natl Univ Sci & Technol NUST, Sch Elect Engn & Comp Sci SEECS, Islamabad 44000, Pakistan
Keywords
Emotion recognition; Feature extraction; Support vector machines; Sensors; Electroencephalography; Three-dimensional displays; Accelerometers; Accelerometer based emotion recognition; emotion recognition using inertial sensors; gait based emotion recognition; IMU; gyroscopes; smartphone; human motion analysis; RECOGNITION; GAIT; EXPRESSION; INITIATION; MOVEMENT; FACE; AGE
DOI
10.1109/JSEN.2020.3004399
CLC classification
TM [Electrical Engineering]; TN [Electronics & Telecommunications]
Subject classification codes
0808; 0809
Abstract
Emotion recognition via gait analysis is an active and important area of research because of its significant academic and commercial potential. With recent developments in hardware technology, inertial sensors allow researchers to capture human motion data effectively for gait analysis. To this end, the aim of this paper is to identify emotions from the inertial signals of human gait recorded with body-mounted smartphones. We extract a hand-crafted set of features from the human inertial gait data, which are used to train classifiers that predict human emotions. Specifically, we collected the inertial gait data of 40 volunteers using a smartphone's on-board inertial measurement unit (3D accelerometer, 3D gyroscope) attached at the chest, covering six basic emotions: sadness, happiness, anger, surprise, disgust, and fear. Using stride-based segmentation, the raw signals are first decomposed into individual strides. For each stride, a set of 296 spectro-temporal features is computed and fed into two supervised learning predictors, namely Support Vector Machines and Random Forest. The classification results obtained with the proposed methodology, validated with a k-fold cross-validation procedure, show a classification accuracy of 95% for binary emotion classification and 86% across all six emotion categories.
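The pipeline the abstract describes (stride-based segmentation of the inertial signal, per-stride spectro-temporal features, then a supervised classifier) can be sketched as follows. This is a minimal illustration on synthetic data: the threshold-plus-local-maximum segmentation, the five placeholder features, and all parameter values are assumptions for demonstration, not the paper's actual 296-feature implementation.

```python
import numpy as np

def segment_strides(acc_mag, fs=50, min_stride_s=0.8):
    # Split the accelerometer-magnitude signal into strides by picking
    # local maxima above the signal mean, at least min_stride_s apart.
    # (Illustrative stand-in for the paper's stride segmentation.)
    min_gap = int(min_stride_s * fs)
    thr = acc_mag.mean()
    peaks, last = [], -min_gap
    for i in range(1, len(acc_mag) - 1):
        if (acc_mag[i] > thr
                and acc_mag[i] > acc_mag[i - 1]
                and acc_mag[i] >= acc_mag[i + 1]
                and i - last >= min_gap):
            peaks.append(i)
            last = i
    return [acc_mag[a:b] for a, b in zip(peaks, peaks[1:])]

def stride_features(stride, fs=50):
    # A handful of spectro-temporal features per stride; the paper
    # computes 296 such features, these five are placeholders.
    spectrum = np.abs(np.fft.rfft(stride - stride.mean()))
    freqs = np.fft.rfftfreq(len(stride), d=1.0 / fs)
    return np.array([
        stride.mean(),                  # temporal: mean
        stride.std(),                   # temporal: standard deviation
        np.sqrt((stride ** 2).mean()),  # temporal: RMS
        freqs[spectrum.argmax()],       # spectral: dominant frequency (Hz)
        spectrum.sum(),                 # spectral: total magnitude
    ])

# Synthetic walking-like signal: ~1 Hz stride cycle plus noise,
# standing in for the chest-mounted smartphone's accelerometer.
fs = 50
t = np.arange(0, 10, 1.0 / fs)
rng = np.random.default_rng(0)
acc_mag = 9.8 + np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(t.size)

strides = segment_strides(acc_mag, fs)
X = np.vstack([stride_features(s, fs) for s in strides])
print(X.shape)  # one feature row per stride: (n_strides, 5)
```

In the paper, feature matrices like `X` (with 296 columns rather than five) are fed to Support Vector Machine and Random Forest classifiers under k-fold cross-validation; any off-the-shelf implementation, e.g. scikit-learn's `SVC` and `RandomForestClassifier`, fits that role.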
Pages
13511-13522 (12 pages)