Eye State Classification Through Analysis of EEG Data Using Deep Learning

Cited by: 1
Authors
Renosa, Claire Receli M. [1 ]
Sybingco, Edwin [1 ]
Vicerra, Ryan Rhay P. [1 ]
Bandala, Argel A. [1 ]
Affiliation
[1] De La Salle Univ, Manila, Philippines
Source
2020 IEEE 12TH INTERNATIONAL CONFERENCE ON HUMANOID, NANOTECHNOLOGY, INFORMATION TECHNOLOGY, COMMUNICATION AND CONTROL, ENVIRONMENT, AND MANAGEMENT (HNICEM) | 2020
Keywords
classification; Deep Learning; eye state; LSTM; MATLAB
DOI
10.1109/HNICEM51456.2020.9400081
CLC number
T [Industrial Technology]
Subject classification code
08
Abstract
The purpose of this study is to create a network that can detect the state of the eyes at a specific time step through analysis of a dataset recorded using a 14-channel Emotiv EEG Neuroheadset. This study can serve as a supporting input in the development of other research and systems that consider eye state and movement as an important factor, such as driving-state detection projects, specifically the classification of drowsiness levels. In this paper, deep learning was applied to create the network, which was trained on a total of 10,424 data points and validated to classify only two states: eyes open and eyes closed. The network was trained and completed using MATLAB and Microsoft Excel. The classification accuracy of the completed network on the testing data reached 89.23% across all 4,468 data points.
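The abstract names an LSTM but does not specify the architecture. As a minimal illustrative sketch (not the authors' MATLAB implementation), the code below runs one standard LSTM cell over a hypothetical window of 14-channel EEG samples and produces a binary eyes-open probability; all dimensions, weights, and the read-out layer are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step (Hochreiter & Schmidhuber, 1997).
    x: current input (n_in,); h, c: previous hidden/cell state (n_hid,).
    W, U, b stack the input, forget, output, and candidate gates."""
    n_hid = h.shape[0]
    z = W @ x + U @ h + b                 # all four gate pre-activations, (4*n_hid,)
    i = sigmoid(z[0 * n_hid:1 * n_hid])   # input gate
    f = sigmoid(z[1 * n_hid:2 * n_hid])   # forget gate
    o = sigmoid(z[2 * n_hid:3 * n_hid])   # output gate
    g = np.tanh(z[3 * n_hid:4 * n_hid])   # candidate cell state
    c_new = f * c + i * g                 # update cell state
    h_new = o * np.tanh(c_new)            # expose gated hidden state
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_hid = 14, 32                      # 14 EEG channels; hidden size is hypothetical
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h = np.zeros(n_hid)
c = np.zeros(n_hid)
window = rng.normal(size=(50, n_in))      # 50 EEG samples, one per time step (hypothetical)
for x in window:
    h, c = lstm_step(x, h, c, W, U, b)

# Binary read-out: probability of "eyes open" vs. "eyes closed" (hypothetical layer)
w_out = rng.normal(scale=0.1, size=n_hid)
p_open = sigmoid(w_out @ h)
print(float(p_open))
```

In a trained model the weights would be learned from the labeled 10,424-point training split rather than drawn at random; the recurrence itself is what lets the classifier use the temporal structure of the EEG stream instead of treating each sample independently.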
Pages: 5