Motion Reveal Emotions: Identifying Emotions From Human Walk Using Chest Mounted Smartphone

Cited by: 27
Authors
Hashmi, Muhammad Arslan [1 ]
Riaz, Qaiser [1 ]
Zeeshan, Muhammad [1 ]
Shahzad, Muhammad [1 ]
Fraz, Muhammad Moazam [1 ]
Affiliations
[1] Natl Univ Sci & Technol NUST, Sch Elect Engn & Comp Sci SEECS, Islamabad 44000, Pakistan
Keywords
Emotion recognition; Feature extraction; Support vector machines; Sensors; Electroencephalography; Three-dimensional displays; Accelerometers; Accelerometer based emotion recognition; emotion recognition using inertial sensors; gait based emotion recognition; IMU; gyroscopes; smartphone; human motion analysis; RECOGNITION; GAIT; EXPRESSION; INITIATION; MOVEMENT; FACE; AGE;
DOI
10.1109/JSEN.2020.3004399
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Code
0808; 0809;
Abstract
Emotion recognition via gait analysis is an active and important area of research owing to its significant academic and commercial potential. With recent advances in hardware technology, inertial sensors allow researchers to capture human motion data effectively for gait analysis. To this end, the aim of this paper is to identify emotions from the inertial signals of human gait recorded with body-mounted smartphones. We extract a hand-crafted set of features from the inertial gait data and use them to train classifiers that predict human emotions. Specifically, we collected the inertial gait data of 40 volunteers using a smartphone's on-board inertial measurement unit (3D accelerometer, 3D gyroscope) mounted at the chest while the participants experienced six basic emotions: sadness, happiness, anger, surprise, disgust, and fear. Using stride-based segmentation, the raw signals are first decomposed into individual strides. For each stride, a set of 296 spectro-temporal features is computed and fed into two supervised learning predictors, namely Support Vector Machines and Random Forest. The classification results obtained with the proposed methodology, validated with a k-fold cross-validation procedure, show an accuracy of 95% for binary emotion classification and 86% for all six emotion categories.
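The pipeline the abstract describes (stride segmentation, per-stride spectro-temporal features, SVM and Random Forest with k-fold cross-validation) can be sketched as below. This is an illustrative sketch, not the authors' code: the peak-based stride detector, the tiny 6-element feature vector (the paper computes 296 features), and the synthetic two-class signal are all placeholder assumptions for demonstration.

```python
# Sketch of the described pipeline: segment an inertial signal into strides,
# compute simple spectro-temporal features per stride, and evaluate SVM and
# Random Forest classifiers with k-fold cross-validation.
import numpy as np
from scipy.signal import find_peaks
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def segment_strides(accel, fs=100):
    """Split a 1-D acceleration signal into strides at detected peaks
    (placeholder detector; the paper's segmentation may differ)."""
    peaks, _ = find_peaks(accel, distance=fs // 2)  # peaks >= 0.5 s apart
    return [accel[a:b] for a, b in zip(peaks[:-1], peaks[1:])]

def stride_features(stride):
    """A tiny spectro-temporal feature vector (stand-in for the 296 features)."""
    spectrum = np.abs(np.fft.rfft(stride))
    return np.array([
        stride.mean(), stride.std(), stride.min(), stride.max(),
        spectrum[:5].sum(),   # low-frequency energy
        spectrum.argmax(),    # dominant frequency bin
    ])

# Synthetic walk data for two emotion classes (stand-in for real IMU data):
# each class walks at a slightly different cadence.
X, y = [], []
for label in (0, 1):
    t = np.arange(0, 60, 0.01)                    # 60 s sampled at 100 Hz
    cadence = 1.0 + 0.3 * label                   # strides per second
    signal = np.sin(2 * np.pi * cadence * t) + 0.1 * rng.standard_normal(t.size)
    for stride in segment_strides(signal):
        X.append(stride_features(stride))
        y.append(label)
X, y = np.array(X), np.array(y)

# k-fold cross-validation of the two classifiers used in the paper.
for name, clf in [("SVM", SVC()), ("RF", RandomForestClassifier(random_state=0))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```

With real chest-mounted IMU recordings, `segment_strides` would be replaced by the paper's stride-based segmentation and `stride_features` by the full 296-feature spectro-temporal set.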
Pages: 13511-13522
Page count: 12