Detecting Driver Behavior Using Stacked Long Short Term Memory Network With Attention Layer

Cited by: 26
Authors
Kouchak, Shokoufeh Monjezi [1 ]
Gaffar, Ashraf [1 ]
Institution
[1] Arizona State Univ, Ira Fulton Sch Engn, Tempe, AZ 85287 USA
Keywords
Automobiles; Brain modeling; Data models; Accidents; Task analysis; Safety; Artificial neural networks; attention network; recurrent neural network; LSTM; vehicle safety; DISTRACTION;
DOI
10.1109/TITS.2020.2986697
CLC Classification Number
TU [Building Science];
Subject Classification Code
0813;
Abstract
Driver distraction is one of the primary causes of fatal car accidents. Modern cars with advanced infotainment systems often draw cognitive attention away from the road, consequently causing more distraction. Driver behavior analysis can be used to address the driver distraction problem. Three important features of intelligence and cognition are perception, attention, and sensory memory. In this work, we use a stacked LSTM network with attention to detect driver distraction from driving data, and we compare this model with both stacked LSTM and MLP models to show the positive effect of the attention mechanism on model performance. We conducted an experiment with eight driving scenarios and collected a large dataset of driving data. First, an MLP was built to detect driver distraction. Next, we increased the intelligence level of the system by using an LSTM network. Third, we added an attention mechanism on top of the LSTM model to enhance its performance. We show that these three increments increase intelligence by reducing train and test error. The minimum train and test errors of the stacked LSTM were 0.57 and 0.9, respectively, which were 0.4 lower than the MLP's minimum train and test errors. Adding attention to the stacked LSTM model decreased the train and test errors to 0.69 and 0.75. Results also show that adding attention reduced overfitting and computational expense.
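The abstract describes a stacked LSTM whose per-timestep hidden states are pooled by a soft attention layer before classification. Below is a minimal PyTorch sketch of that kind of architecture; the layer sizes, number of stacked layers, input signals, and binary distracted/attentive output are illustrative assumptions, not details taken from this record.

```python
# Minimal sketch (PyTorch) of a stacked LSTM with a soft attention layer
# over the per-timestep hidden states, followed by a small classifier.
# Hidden size, number of layers, and the two-class output are assumptions;
# the paper's exact architecture and hyperparameters are not given here.
import torch
import torch.nn as nn


class StackedLSTMAttention(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 64,
                 num_layers: int = 2, n_classes: int = 2):
        super().__init__()
        # Stacked (multi-layer) LSTM over the driving-data sequence.
        self.lstm = nn.LSTM(n_features, hidden_size,
                            num_layers=num_layers, batch_first=True)
        # Scores each timestep's hidden state; softmax turns the scores
        # into attention weights that sum to 1 over the sequence.
        self.attn_score = nn.Linear(hidden_size, 1)
        self.classifier = nn.Linear(hidden_size, n_classes)

    def forward(self, x):                       # x: (batch, time, n_features)
        h, _ = self.lstm(x)                     # h: (batch, time, hidden)
        scores = self.attn_score(h)             # (batch, time, 1)
        weights = torch.softmax(scores, dim=1)  # attention over timesteps
        context = (weights * h).sum(dim=1)      # weighted sum: (batch, hidden)
        return self.classifier(context)         # logits: (batch, n_classes)


# Example: a batch of 8 driving-data windows, 50 timesteps, 10 signals each.
model = StackedLSTMAttention(n_features=10)
logits = model(torch.randn(8, 50, 10))
print(logits.shape)  # torch.Size([8, 2])
```

The attention pooling replaces taking only the final LSTM hidden state: the model learns which timesteps of the driving window matter most for the distracted/attentive decision.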
Pages: 3420-3429 (10 pages)