An integrated telemedicine platform for the assessment of affective physiological states

Cited by: 0
Authors
Christos D Katsis
George Ganiatsas
Dimitrios I Fotiadis
Affiliations
[1] Dept. of Medical Physics, Medical School, University of Ioannina
[2] Unit of Medical Technology and Intelligent Information Systems, Dept. of Computer Science, University of Ioannina
Source
Diagnostic Pathology | Volume 1
Keywords
Facial Expression; Emotion Recognition; Electrodermal Activity; Feature Extraction Module; Animation Video;
DOI
Not available
Abstract
AUBADE is an integrated platform built for the affective assessment of individuals. The system evaluates the emotional state by classifying vectors of features extracted from the facial electromyogram, respiration, electrodermal activity and electrocardiogram. The AUBADE system consists of: (a) a multisensorial wearable, (b) a data acquisition and wireless communication module, (c) a feature extraction module, (d) a 3D facial animation module, which projects the acquired data onto a generic 3D face model so that the end user can view the subject's facial expression in real time, (e) an intelligent emotion recognition module, and (f) the AUBADE databases, where the acquired signals are stored together with the subject's animation videos. The system is designed to be applied to human subjects operating under extreme stress conditions, in particular car racing drivers, and also to patients suffering from neurological and psychological disorders. AUBADE's classification accuracy into five predefined emotional classes (high stress, low stress, disappointment, euphoria and neutral face) is 86.0%. The pilot system applications and components are being tested and evaluated on Maserati's car racing drivers.
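The abstract does not specify the feature set or the classifier used by the emotion recognition module, so the following Python sketch only illustrates the general pipeline it describes: per-channel feature extraction from facial EMG, respiration, EDA and ECG, followed by classification into the five emotional classes. The statistical features and the scikit-learn SVM are illustrative assumptions, not the published AUBADE method.

import numpy as np
from sklearn.svm import SVC

# Emotional classes named in the abstract.
CLASSES = ["high stress", "low stress", "disappointment", "euphoria", "neutral"]

def extract_features(window):
    # Collapse one signal window (facial EMG, respiration, EDA or ECG)
    # into simple statistical descriptors; placeholders for the real
    # AUBADE feature extraction module.
    window = np.asarray(window, dtype=float)
    return [np.mean(window), np.std(window), np.ptp(window)]

def build_feature_vector(emg, resp, eda, ecg):
    # Concatenate per-channel features into one vector per epoch.
    vec = []
    for channel in (emg, resp, eda, ecg):
        vec.extend(extract_features(channel))
    return np.array(vec)

# Hypothetical labelled training epochs: 12 features (3 per channel) each.
rng = np.random.default_rng(0)
X_train = rng.random((50, 12))
y_train = rng.integers(0, len(CLASSES), 50)

clf = SVC(kernel="rbf")  # classifier choice is an assumption
clf.fit(X_train, y_train)

# Classify one new epoch of multichannel data.
new_vec = build_feature_vector(rng.random(256), rng.random(256),
                               rng.random(256), rng.random(256))
predicted = CLASSES[clf.predict(new_vec.reshape(1, -1))[0]]
print("Predicted affective state:", predicted)

In the deployed system the feature vectors would come from the wearable's acquisition module rather than random data, and the trained model would feed both the emotion recognition output and the 3D facial animation module.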