Brain Region-Based Vigilance Assessment Using Electroencephalography and Eye Tracking Data Fusion

Cited by: 0
Authors
Abu Farha, Nadia [1 ]
Al-Shargie, Fares [1 ,2 ]
Tariq, Usman [1 ,2 ]
Al-Nashash, Hasan [1 ,2 ]
Affiliations
[1] Amer Univ Sharjah, Biomed Engn Grad Program, Sharjah, U Arab Emirates
[2] Amer Univ Sharjah, Dept Elect Engn, Sharjah, U Arab Emirates
Keywords
Electroencephalography; Machine learning; Gaze tracking; Image color analysis; Correlation; Electrodes; Support vector machines; Data integration; Data fusion; EEG; eye tracking; vigilance assessment; canonical correlation analysis (CCA); machine learning; CANONICAL CORRELATION-ANALYSIS; EEG; PERFORMANCE; CLASSIFICATION; ATTENTION; LEVEL; WORK;
DOI
10.1109/ACCESS.2022.3216407
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812 ;
Abstract
Vigilance is the capacity to remain alert for an extended time while performing a task. Staying alert is obligatory in many jobs, particularly those that involve monitoring, such as surveillance tasks, security monitoring, and air traffic control. These monitoring tasks require a specific level of arousal to maintain an adequate level of cognitive efficiency. In this study, we investigate the possibility of assessing vigilance levels using a fusion of electroencephalography (EEG) and eye tracking data. Vigilance levels were established by performing a modified version of the Stroop color word task (SCWT) for 30 minutes. Feature-level fusion based on canonical correlation analysis (CCA) was applied to each brain region to improve the classification accuracy of vigilance level assessment. Results obtained using a support vector machine (SVM) classifier show that the fusion of EEG and eye tracking modalities improved the classification accuracy compared to either modality alone. The EEG+eye tracking fusion on the right central brain region achieved the highest classification accuracy of 97.4 +/- 1.3%, compared to 92.0 +/- 7.3% for Beta-band EEG alone and 76.8 +/- 8.4% for eye tracking alone. Likewise, EEG and eye tracking fusion on the right frontal region showed a classification accuracy of 96.9 +/- 1.1% for both the Alpha and Beta bands. Meanwhile, when all brain regions were utilized, the highest classification accuracy of the EEG+eye tracking fusion was 96.8 +/- 0.6% using the Delta band, compared to 88.18 +/- 8.5% for EEG alone and 76.8 +/- 8.4% for eye tracking alone. The overall results showed that vigilance is brain-region specific and that fusing EEG and eye tracking data using CCA significantly improved the classification accuracy of vigilance level assessment.
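The abstract describes CCA-based feature-level fusion of EEG band-power and eye-tracking features followed by SVM classification. The sketch below illustrates one plausible way to set up such a pipeline with scikit-learn; the feature counts, the fusion rule (concatenating the canonical variates of both modalities), and the synthetic data are assumptions for illustration, not the authors' exact method.

```python
# Minimal sketch (assumed, not the authors' pipeline): CCA feature-level fusion
# of EEG and eye-tracking features per brain region, then SVM classification.
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins: 200 trials of, e.g., Beta-band power from the electrodes
# of one brain region (EEG) and pupil/fixation/blink statistics (eye tracking).
n_trials = 200
X_eeg = rng.normal(size=(n_trials, 16))   # EEG band-power features
X_eye = rng.normal(size=(n_trials, 6))    # eye-tracking features
y = rng.integers(0, 2, size=n_trials)     # 0 = vigilant, 1 = vigilance decrement

# Project both modalities onto a shared canonical space and fuse the features
# by concatenating the canonical variates of the two views.
cca = CCA(n_components=4)
U, V = cca.fit_transform(X_eeg, X_eye)
X_fused = np.hstack([U, V])

# SVM classifier on the fused features, evaluated with 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X_fused, y, cv=5)
print(f"Fused EEG+eye accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```

In a stricter evaluation the CCA projection would be fitted only on the training folds to avoid any information leaking into the test data; it is fitted on the full set here solely to keep the sketch short.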
Pages: 112199 - 112210
Page count: 12
Related Papers
50 records
  • [21] Study of Cybersickness Prediction in Real Time Using Eye Tracking Data
    Shimada, Shogo
    Ikei, Yasushi
    Nishiuchi, Nobuyuki
    Yem, Vibol
    2023 IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES ABSTRACTS AND WORKSHOPS, VRW, 2023, : 871 - 872
  • [22] Estimating Self-Confidence in Video-Based Learning Using Eye-Tracking and Deep Neural Networks
    Bhatt, Ankur
    Watanabe, Ko
    Santhosh, Jayasankar
    Dengel, Andreas
    Ishimaru, Shoya
    IEEE ACCESS, 2024, 12 : 192219 - 192229
  • [23] Electroencephalography based fatigue detection using a novel feature fusion and extreme learning machine
    Chen, Jichi
    Wang, Hong
    Hua, Chengcheng
    COGNITIVE SYSTEMS RESEARCH, 2018, 52 : 715 - 728
  • [24] A spasticity assessment method for voluntary movement using data fusion and machine learning
    Chen, Yan
    Yu, Song
    Cai, Qing
    Huang, Shuangyuan
    Ma, Ke
    Zheng, Haiqing
    Xie, Longhan
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2021, 65
  • [25] Autism Spectrum Disorder diagnosis based on trajectories of eye tracking data
    Cardoso, Thiago V.
    Michelassi, Gabriel C.
    Franco, Felipe O.
    Sumiya, Fernando M.
    Portolese, Joana
    Brentani, Helena
    Machado-Lima, Ariane
    Nunes, Fatima L. S.
    2021 IEEE 34TH INTERNATIONAL SYMPOSIUM ON COMPUTER-BASED MEDICAL SYSTEMS (CBMS), 2021, : 50 - 55
  • [26] A MULTISOURCES ELECTRIC WHEELCHAIR CONTROL BASED ON EEG SIGNALS AND FUZZY EYE TRACKING FUSION
    Ben Taher, Fatma
    Ben Amor, Nader
    Jallouli, Mohamed
    CONTROL AND INTELLIGENT SYSTEMS, 2016, 44 (02) : 67 - 74
  • [27] A novel region-based multimodal image fusion technique using improved dictionary learning
    Meher, Bikash
    Agrawal, Sanjay
    Panda, Rutuparna
    Dora, Lingraj
    Abraham, Ajith
    INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, 2020, 30 (03) : 558 - 576
  • [28] ASSESSMENT OF SITUATION AWARENESS FOR SEAFARERS USING EYE-TRACKING DATA
    Virdi, S. S.
    Ng, Yong Thiang
    Liu, Yisi
    Tan, Kelvin
    Zhang, Daniel
    PROCEEDINGS OF ASME 2022 41ST INTERNATIONAL CONFERENCE ON OCEAN, OFFSHORE & ARCTIC ENGINEERING, OMAE2022, VOL 1, 2022,
  • [29] Applying Machine Learning to Intelligent Assessment of Scientific Creativity Based on Scientific Knowledge Structure and Eye-Tracking Data
    Zhang, Yang
    Li, Yangping
    Hu, Weiping
    Bai, Huizhi
    Lyu, Yuanjing
    JOURNAL OF SCIENCE EDUCATION AND TECHNOLOGY, 2025, 34 (02) : 401 - 419
  • [30] Unravelling cognitive processing of in-game brands using eye tracking and electroencephalography: incongruence fosters it
    Aliagas, Irene
    Privado, Jesus
    Merino, Ma Dolores
    CURRENT PSYCHOLOGY, 2024, 43 (08) : 7628 - 7642