A Spontaneous Driver Emotion Facial Expression (DEFE) Dataset for Intelligent Vehicles: Emotions Triggered by Video-Audio Clips in Driving Scenarios

Cited by: 41
Authors
Li, Wenbo [1 ,2 ]
Cui, Yaodong [2 ]
Ma, Yintao [2 ]
Chen, Xingxin [2 ]
Li, Guofa [2 ,3 ]
Zeng, Guanzhong [1 ]
Guo, Gang [1 ]
Cao, Dongpu [2 ]
Affiliations
[1] Chongqing Univ, Sch Automot Engn, Chongqing 400044, Peoples R China
[2] Univ Waterloo, Dept Mech & Mechatron Engn, Waterloo, ON N2L 3G1, Canada
[3] Shenzhen Univ, Inst Human Factors & Ergon, Coll Mechatron & Control Engn, Shenzhen 518060, Guangdong, Peoples R China
Keywords
Solid modeling; Emotion recognition; Safety; Task analysis; Intelligent vehicles; Vehicle dynamics; Physiology; Driving safety; driver emotion; facial expression dataset; spontaneous expression; affective computing; intelligent vehicles; CULTURAL-DIFFERENCES; MODEL; IDENTIFICATION; RECOGNITION; UNIVERSALS; ANGER
DOI
10.1109/TAFFC.2021.3063387
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this article, a new dataset, the Driver Emotion Facial Expression (DEFE) dataset, is introduced for the analysis of drivers' spontaneous emotions. The dataset includes facial expression recordings from 60 participants during driving. After watching a video-audio clip selected to elicit a specific emotion, each participant completed driving tasks in the same driving scenario and rated their emotional responses during the driving process using both the dimensional emotion method and the discrete emotion method. Classification experiments were also conducted to recognize the scales of arousal, valence, and dominance, as well as emotion category and intensity, to establish baseline results for the proposed dataset. Furthermore, this paper compares facial-expression-based emotion recognition results between dynamic driving and static life scenarios, and the results show that the two settings yield different recognition outcomes. To explore the reasons for this difference, the presence of facial action units (AUs) was analyzed. The results showed significant differences in AU presence between dynamic driving and static life scenarios, indicating that the driving task may affect drivers' facial expressions and thereby influence the recognition of drivers' emotions from facial expressions. Therefore, to accurately recognize drivers' emotions and establish a reliable emotion-aware human-machine interaction system, thereby improving driving safety and comfort, a human emotion dataset specifically for drivers is necessary. The proposed dataset will be made publicly available so that researchers worldwide can use it to develop and examine their driver emotion analysis methods. To the best of our knowledge, this is currently the only public driver facial expression dataset.
Pages: 747-760
Page count: 14