Deep convolution network based emotion analysis towards mental health care

Cited by: 99
Authors
Fei, Zixiang [1 ]
Yang, Erfu [1 ]
Li, David Day-Uei [2 ]
Butler, Stephen [3 ]
Ijomah, Winifred [1 ]
Li, Xia [4 ]
Zhou, Huiyu [5 ]
Affiliations
[1] Univ Strathclyde, Dept Design Mfg & Engn Management, Glasgow G1 1XJ, Lanark, Scotland
[2] Univ Strathclyde, Strathclyde Inst Pharm & Biomed Sci, Glasgow G4 0RE, Lanark, Scotland
[3] Univ Strathclyde, Sch Psychol Sci & Hlth, Glasgow G1 1QE, Lanark, Scotland
[4] Shanghai Jiao Tong Univ, Shanghai, Peoples R China
[5] Univ Leicester, Dept Informat, Leicester LE1 7RH, Leics, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Facial expression recognition; Deep convolution network; Mental health care; Emotion analysis; FACIAL EXPRESSION RECOGNITION; NEURAL-NETWORK; EXPERIENCE; FACE;
DOI
10.1016/j.neucom.2020.01.034
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Facial expressions play an important role during communication, allowing information regarding the emotional state of an individual to be conveyed and inferred. Research suggests that automatic facial expression recognition is a promising avenue of enquiry in mental healthcare, as facial expressions can also reflect an individual's mental state. In order to develop user-friendly, low-cost and effective facial expression analysis systems for mental health care, this paper presents a novel deep convolution network based emotion analysis framework to support mental state detection and diagnosis. The proposed system processes facial images and interprets the temporal evolution of emotions through a new solution in which deep features are extracted from the Fully Connected Layer 6 (fc6) of AlexNet, and a standard Linear Discriminant Analysis classifier is then used to obtain the final classification outcome. It is tested against five benchmark databases: JAFFE, KDEF and CK+, together with the 'in the wild' databases FER2013 and AffectNet. Compared with other state-of-the-art methods, the proposed method achieves higher overall facial expression recognition accuracy. Additionally, when compared to state-of-the-art deep learning architectures such as VGG16, GoogLeNet, ResNet and AlexNet, the proposed method demonstrates better efficiency and lower hardware requirements. The experiments presented in this paper show that the proposed method outperforms the other methods in terms of accuracy and efficiency, suggesting it could act as a smart, low-cost, user-friendly cognitive aid to detect, monitor, and diagnose the mental health of a patient through automatic facial expression analysis. (C) 2020 Published by Elsevier B.V.
Pages: 212-227
Page count: 16
Cited References
49 references in total
[1] Anonymous, 2008, P 25 INT C MACH LEAR. DOI 10.1145/1390156.1390177
[2] Anonymous, P 3 IEEE INT C AUT F
[3] Anonymous, 2018, PROC INT WORKSH ADV
[4] Bahr G.S., 2007, NONVERBALLY SMART US
[5] Baron-Cohen S.; Riviere A.; Fukushima M.; French D.; Hadwin J.; Cross P.; Bryant C.; Sotillo M. Reading the mind in the face: A cross-cultural and developmental study. Visual Cognition, 1996, 3(1): 39-59.
[6] Branco P., 2005, CHI'05 Extended Abstracts on Human Factors in Computing Systems, p. 1236
[7] Burton K.W.; Kaszniak A.W. Emotional experience and facial expression in Alzheimer's disease. Aging, Neuropsychology, and Cognition, 2006, 13(3-4): 636-651.
[8] Chavan U., 2019, ADV INTELLIGENT SYST, p. 185. DOI 10.1007/978-981-13-1402-5_14
[9] Chen X.P., 2017, INT CONF ASIC, p. 815. DOI 10.1109/ASICON.2017.8252601
[10] Engin D., 2018, EUR SIGNAL PR CONF, p. 1795. DOI 10.23919/EUSIPCO.2018.8553087