Assessment of Attention-based Deep Learning Architectures for Classifying EEG in ADHD and Typical Children

Cited: 0
Authors
Han, Mingzhu [1 ]
Jin, Guoqin [2 ]
Li, Wei [3 ]
Affiliations
[1] Zhejiang Business Coll, Dept Ind Coll Cooperat, Hangzhou 310053, Peoples R China
[2] Zhejiang Business Coll, Org & Personnel Dept, Hangzhou 310053, Peoples R China
[3] Zhejiang Business Coll, Presidents Off, Hangzhou 310053, Peoples R China
Keywords
ADHD; EEG; deep learning; attention mechanisms; CNN; LSTM; DEFICIT/HYPERACTIVITY DISORDER; CLASSIFICATION; ADOLESCENTS; PREVALENCE;
DOI
10.14569/IJACSA.2024.0150324
CLC Number
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
Although limited research has explored the integration of electroencephalography (EEG) and deep learning for attention deficit hyperactivity disorder (ADHD) detection, applying deep learning models to real-world data such as EEG remains a difficult endeavour. The purpose of this work was to evaluate how different attention mechanisms affect the performance of well-established deep learning models for ADHD identification. Two architectures were compared: long short-term memory with attention (LSTM+Att) and a convolutional neural network with attention (CNN+Att). The CNN+Att model consists of a CNN layer merged with the convolutional block attention module (CBAM), a dropout layer, an LSTM layer, and a dense layer. For the LSTM+Att model, an extra LSTM layer comprising T LSTM cells was stacked on top of the first LSTM layer; the output of this stacked LSTM structure was passed to a dense layer connected to a classification layer of two neurons. Experimental results showed that the best classification performance was achieved by the LSTM+Att model, with 98.91% accuracy, 99.87% sensitivity, 97.79% specificity and a 98.87% F1-score. The LSTM, CNN+Att, and CNN models then classified ADHD and normal EEG signals with 98.45%, 97.74% and 97.16% accuracy, respectively. By investigating the use of attention mechanisms and the precise position of the attention layer within the deep learning model, the information in the data was exploited more effectively. This finding creates opportunities for further study on large-scale EEG datasets and for more reliable information extraction from massive data, ultimately allowing links to be drawn between brain activity and specific behaviours or task execution.
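As a reading aid, the following is a minimal Keras sketch of the stacked-LSTM-plus-attention classifier outlined in the abstract (stacked LSTM layers, a dense layer, and a two-neuron classification layer). The input shape, unit counts, dropout rate, choice of a dot-product Attention layer, and its placement are illustrative assumptions, not values reported in the paper.

```python
# Hedged sketch, not the authors' implementation: hyperparameters and the
# exact attention variant/placement are assumed for illustration only.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_lstm_att(timesteps=512, channels=19, units=64):
    # EEG epoch: timesteps x channels (shape is an assumption)
    inputs = layers.Input(shape=(timesteps, channels))
    x = layers.LSTM(units, return_sequences=True)(inputs)   # first LSTM layer
    x = layers.LSTM(units, return_sequences=True)(x)        # extra stacked LSTM layer
    att = layers.Attention()([x, x])                        # dot-product self-attention (assumed)
    x = layers.GlobalAveragePooling1D()(att)                # collapse the time dimension
    x = layers.Dropout(0.3)(x)
    x = layers.Dense(32, activation="relu")(x)              # dense layer before the classifier
    outputs = layers.Dense(2, activation="softmax")(x)      # two-neuron ADHD vs. normal output
    return models.Model(inputs, outputs)

model = build_lstm_att()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```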
Pages: 234-243
Page count: 10
Related Papers
50 records in total
  • [41] EEG Biofeedback Improves Attention and Alertness in Children with ADHD
    Sokhadze, Estate
    Sears, Lonnie
    Sokhadze, Guela E.
    APPLIED PSYCHOPHYSIOLOGY AND BIOFEEDBACK, 2011, 36 (03) : 223 - 223
  • [42] Classifying adolescent attention-deficit/hyperactivity disorder (ADHD) based on functional and structural imaging
    Iannaccone, Reto
    Hauser, Tobias U.
    Ball, Juliane
    Brandeis, Daniel
    Walitza, Susanne
    Brem, Silvia
    EUROPEAN CHILD & ADOLESCENT PSYCHIATRY, 2015, 24 (10) : 1279 - 1289
  • [43] Is attention all geosciences need? Advancing quantitative petrography with attention-based deep learning
    Koeshidayatullah, Ardiansyah
    Ferreira-Chacua, Ivan
    Li, Weichang
    COMPUTERS & GEOSCIENCES, 2023, 181
  • [44] Classification of Hand Movements From EEG Using a Deep Attention-Based LSTM Network
    Zhang, Guangyi
    Davoodnia, Vandad
    Sepas-Moghaddam, Alireza
    Zhang, Yaoxue
    Etemad, Ali
    IEEE SENSORS JOURNAL, 2020, 20 (06) : 3113 - 3122
  • [45] Handwriting-Based ADHD Detection for Children Having ASD Using Machine Learning Approaches
    Shin, Jungpil
    Maniruzzaman, Md.
    Uchida, Yuta
    Hasan, Md. Al Mehedi
    Megumi, Akiko
    Yasumura, Akira
    IEEE ACCESS, 2023, 11 : 84974 - 84984
  • [46] Attention-based deep learning for chip-surface-defect detection
    Wang, Shuo
    Wang, Hongyu
    Yang, Fan
    Liu, Fei
    Zeng, Long
    INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, 2022, 121 (3-4) : 1957 - 1971
  • [47] A hybrid attention-based deep learning approach for wind power prediction
    Ma, Zhengjing
    Mei, Gang
    APPLIED ENERGY, 2022, 323
  • [48] RoseSegNet: An attention-based deep learning architecture for organ segmentation of plants
    Turgut, Kaya
    Dutagaci, Helin
    Rousseau, David
    BIOSYSTEMS ENGINEERING, 2022, 221 : 138 - 153
  • [49] English to Persian Transliteration using Attention-based Approach in Deep Learning
    Mahsuli, Mohammad Mahdi
    Safabakhsh, Reza
    2017 25TH IRANIAN CONFERENCE ON ELECTRICAL ENGINEERING (ICEE), 2017, : 174 - 178
  • [50] Global Semantic Classification of Fluvial Landscapes with Attention-Based Deep Learning
    Carbonneau, Patrice E.
    REMOTE SENSING, 2024, 16 (24)