Student Engagement Detection Using Emotion Analysis, Eye Tracking and Head Movement with Machine Learning

Cited by: 28
Authors
Sharma, Prabin [1]
Joshi, Shubham [2]
Gautam, Subash [2]
Maharjan, Sneha [3]
Khanal, Salik Ram [4]
Reis, Manuel Cabral [5,6]
Barroso, Joao [5,7]
de Jesus Filipe, Vitor Manuel [5,7]
Affiliations
[1] Univ Massachusetts, Boston, MA 02125 USA
[2] Kathmandu Univ, Dhulikhel, Nepal
[3] Wentworth Inst Technol, Boston, MA USA
[4] Washington State Univ, Ctr Precis & Automated Agr Syst, Prosser, WA 99164 USA
[5] Univ Tras os Montes & Alto Douro, Vila Real, Portugal
[6] Inst Elect & Informat Engn Aveiro, Aveiro, Portugal
[7] INESC TEC, Porto, Portugal
Source
TECHNOLOGY AND INNOVATION IN LEARNING, TEACHING AND EDUCATION, TECH-EDU 2022 | 2022, Vol. 1720
Keywords
E-learning; Student engagement detection; Facial emotion; Eye-head movement; Machine learning; RECOGNITION; SCHOOLS; SYSTEM;
DOI
10.1007/978-3-031-22918-3_5
Chinese Library Classification
TP39 [Computer Applications];
Discipline Classification Code
081203 ; 0835 ;
Abstract
With the increase of distance learning in general, and e-learning in particular, having a system capable of determining student engagement is of paramount importance, and one of the biggest challenges for teachers, researchers, and policymakers alike. Here, we present a system to detect the engagement level of students. It uses only the information provided by the typical built-in web-camera of a laptop computer, and it was designed to work in real time. We combine information about eye and head movements with facial emotions to produce a concentration index with three classes of engagement: "very engaged", "nominally engaged", and "not engaged at all". The system was tested in a typical e-learning scenario, and the results show that it correctly identifies each period of time in which students were "very engaged", "nominally engaged", and "not engaged at all". Additionally, the results also show that the students with the best scores also have higher concentration indexes.
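The combination described in the abstract can be sketched as a weighted fusion of two normalized per-frame scores mapped onto the three engagement classes. The weights, thresholds, and function names below are illustrative assumptions for this sketch, not values or code from the paper.

```python
# Hypothetical sketch: fuse an emotion-derived score and an eye/head-movement
# score (both assumed normalized to [0, 1]) into a concentration index, then
# bin the index into the three engagement classes named in the abstract.
# Weights and class thresholds are assumptions, not the paper's values.

def concentration_index(emotion_score: float, movement_score: float,
                        w_emotion: float = 0.5, w_movement: float = 0.5) -> float:
    """Weighted combination of the two normalized scores."""
    return w_emotion * emotion_score + w_movement * movement_score

def engagement_class(index: float, low: float = 0.33, high: float = 0.66) -> str:
    """Map a concentration index in [0, 1] to one of the three classes."""
    if index >= high:
        return "very engaged"
    if index >= low:
        return "nominally engaged"
    return "not engaged at all"
```

In a real-time pipeline, the two input scores would come from per-frame facial-emotion classification and eye/head-movement tracking on the webcam feed, with the index typically smoothed over a short window before classification.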
Pages: 52-68 (17 pages)