Assessment of Attention-based Deep Learning Architectures for Classifying EEG in ADHD and Typical Children

Cited: 0
Authors
Han, Mingzhu [1 ]
Jin, Guoqin [2 ]
Li, Wei [3 ]
Affiliations
[1] Zhejiang Business Coll, Dept Ind Coll Cooperat, Hangzhou 310053, Peoples R China
[2] Zhejiang Business Coll, Org & Personnel Dept, Hangzhou 310053, Peoples R China
[3] Zhejiang Business Coll, Presidents Off, Hangzhou 310053, Peoples R China
Keywords
ADHD; EEG; deep learning; attention mechanisms; CNN; LSTM; DEFICIT/HYPERACTIVITY DISORDER; CLASSIFICATION; ADOLESCENTS; PREVALENCE;
DOI
10.14569/IJACSA.2024.0150324
Chinese Library Classification (CLC)
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
Although limited research has explored the integration of electroencephalography (EEG) and deep learning for attention deficit hyperactivity disorder (ADHD) detection, applying deep learning models to real-world data such as EEG remains a difficult endeavour. The purpose of this work was to evaluate how different attention mechanisms affect the performance of well-established deep-learning models for ADHD identification. Two architectures were compared: long short-term memory+attention (LSTM+Att) and convolutional neural network+attention (CNN+Att). The CNN+Att model consists of a CNN layer merged with the convolutional block attention module (CBAM) structure, a dropout layer, an LSTM layer, and a dense layer. For the LSTM+Att model, an extra LSTM layer comprising T LSTM cells was stacked on top of the first LSTM layer. The output of this stacked LSTM structure was passed to a dense layer, which in turn was connected to a classification layer of two neurons. Experimental results showed that the best classification performance was achieved by the LSTM+Att model, with 98.91% accuracy, 99.87% sensitivity, 97.79% specificity, and a 98.87% F1-score. The LSTM, CNN+Att, and CNN models followed, classifying ADHD and normal EEG signals with 98.45%, 97.74%, and 97.16% accuracy, respectively. By investigating attention mechanisms and the precise position of the attention layer inside the deep learning model, the information in the data was exploited more effectively. This finding opens opportunities for further study on large-scale EEG datasets and more reliable information extraction from massive data sets, ultimately allowing links to be made between brain activity and specific behaviours or task execution.
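The LSTM+Att pipeline described above (stacked LSTM states, attention pooling over time, a dense layer, then a two-neuron classifier) can be sketched with plain numpy for a single EEG segment. This is a minimal illustration only: the shapes (T=128 time steps, H=64 hidden units), the additive-attention scoring vector `w_att`, and the dense width of 32 are hypothetical, not taken from the paper, and random weights stand in for trained LSTM outputs and parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes; the abstract does not publish exact dimensions.
T, H = 128, 64                        # time steps, hidden units
h = rng.standard_normal((T, H))       # stand-in for stacked-LSTM hidden states

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Attention over time: score each step, then pool states by the weights.
w_att = rng.standard_normal(H)
scores = h @ w_att                    # (T,) one relevance score per time step
alpha = softmax(scores)               # attention weights, sum to 1 over time
context = alpha @ h                   # (H,) attention-weighted summary vector

# Dense layer feeding the two-neuron classification layer.
W_d, b_d = rng.standard_normal((H, 32)), np.zeros(32)
W_c, b_c = rng.standard_normal((32, 2)), np.zeros(2)
dense = np.tanh(context @ W_d + b_d)
probs = softmax(dense @ W_c + b_c)    # [P(ADHD), P(typical control)]

print(probs.shape)                    # (2,)
```

The design point the abstract highlights is where the pooling happens: instead of taking only the last LSTM state, the attention weights `alpha` let the classifier draw on whichever time steps of the EEG segment are most informative.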
Pages: 234-243
Page count: 10
Related Papers
50 records
  • [21] Detection of ADHD from EEG signals using new hybrid decomposition and deep learning techniques
    Esas, Mustafa Yasin
    Latifoglu, Fatma
    JOURNAL OF NEURAL ENGINEERING, 2023, 20 (03)
  • [22] Neurological state changes indicative of ADHD in children learned via EEG-based LSTM networks
    Chang, Yang
    Stevenson, Cory
    Chen, I-Chun
    Lin, Dar-Shong
    Ko, Li-Wei
    JOURNAL OF NEURAL ENGINEERING, 2022, 19 (01)
  • [23] An Attention-Based Deep Learning Network for Predicting Platinum Resistance in Ovarian Cancer
    Zhuang, Haoming
    Li, Beibei
    Ma, Jingtong
    Monkam, Patrice
    Qian, Wei
    He, Dianning
    IEEE ACCESS, 2024, 12 : 41000 - 41008
  • [24] Generalized attention-based deep multi-instance learning
    Zhao, Lu
    Yuan, Liming
    Hao, Kun
    Wen, Xianbin
    MULTIMEDIA SYSTEMS, 2023, 29 (01) : 275 - 287
  • [26] aDFR: An Attention-Based Deep Learning Model for Flight Ranking
    Yi, Yuan
    Cao, Jian
    Tan, YuDong
    Nie, QiangQiang
    Lu, XiaoXi
    WEB INFORMATION SYSTEMS ENGINEERING, WISE 2020, PT II, 2020, 12343 : 548 - 562
  • [27] Federated deep active learning for attention-based transaction classification
    Usman Ahmed
    Jerry Chun-Wei Lin
    Philippe Fournier-Viger
    Applied Intelligence, 2023, 53 : 8631 - 8643
  • [29] Multimodal attention-based deep learning for automatic modulation classification
    Han, Jia
    Yu, Zhiyong
    Yang, Jian
    FRONTIERS IN ENERGY RESEARCH, 2023, 10
  • [30] Mobile traffic prediction with attention-based hybrid deep learning
    Wang, Li
    Che, Linxiao
    Lam, Kwok-Yan
    Liu, Wenqiang
    Li, Feng
    PHYSICAL COMMUNICATION, 2024, 66