AttenEpilepsy: A 2D convolutional network model based on multi-head self-attention

Citations: 0
Authors
Ma, Shuang [1 ,2 ]
Wang, Haifeng [1 ,2 ]
Yu, Zhihao [1 ,2 ]
Du, Luyao [4 ]
Zhang, Ming [1 ,2 ]
Fu, Qingxi [2 ,3 ]
Affiliations
[1] Linyi Univ, Sch Informat Sci & Engn, Linyi 276000, Shandong, Peoples R China
[2] Linyi Peoples Hosp, Hlth & Med Big Data Ctr, Linyi 276034, Shandong, Peoples R China
[3] Linyi City Peoples Hosp, Linyi Peoples Hosp Shandong Prov, Linyi 276034, Peoples R China
[4] Wuhan Univ Technol, Sch Automation, Wuhan, Peoples R China
Keywords
Long-range dependence; Time-frequency image; Feature extraction; Time-frequency context encoding; Multi-head self-attention; Causal convolution; SLEEP STAGE CLASSIFICATION; LEARNING APPROACH;
DOI
10.1016/j.enganabound.2024.105989
CLC Classification Number
T [Industrial Technology];
Subject Classification Code
08;
Abstract
Existing epilepsy detection models emphasize local information over genuine long-range dependence when capturing time-frequency image features, which leaves feature-vector extraction imprecise and detection accuracy with room for improvement. AttenEpilepsy is a novel 2D convolutional network model that uses a multi-head self-attention mechanism to classify single-channel EEG signals into seizure, inter-seizure, and healthy states. The model consists of two parts: feature extraction and time-frequency context encoding (STCE). For feature extraction, a method combining multi-path convolution and adaptive hybrid feature recalibration is proposed, in which parallel convolutions with kernels of different sizes extract multi-scale features from the time-frequency images. STCE consists of two modules, multi-head self-attention and causal convolution: a modified multi-head self-attention mechanism models the extracted time-frequency features, and causal convolution analyses the frequency information along the time dependencies. A public dataset from the University of Bonn Epilepsy Research Center is used to evaluate the model. The experimental results show that AttenEpilepsy achieves an accuracy (AC), sensitivity (SE), specificity (SP), and F1 score (F1) of 99.81%, 99.82%, 99.89%, and 99.83%, respectively. The robustness of the model is further tested by introducing various types of noise into the input data. The proposed AttenEpilepsy network model outperforms state-of-the-art methods on these evaluation metrics.
Pages: 11
Related Papers
50 records in total
  • [41] Multi-Head Self-Attention for 3D Point Cloud Classification
    Gao, Xue-Yao
    Wang, Yan-Zhao
    Zhang, Chun-Xiang
    Lu, Jia-Qi
    IEEE ACCESS, 2021, 9 : 18137 - 18147
  • [42] The sentiment analysis model with multi-head self-attention and Tree-LSTM
Li, Lei
Pei, Yijian
Jin, Chenyang
    SIXTH INTERNATIONAL WORKSHOP ON PATTERN RECOGNITION, 2021, 11913
  • [43] Multi-Head Self-Attention Model for Classification of Temporal Lobe Epilepsy Subtypes
    Gu, Peipei
    Wu, Ting
    Zou, Mingyang
    Pan, Yijie
    Guo, Jiayang
    Xiahou, Jianbing
    Peng, Xueping
    Li, Hailong
    Ma, Junxia
    Zhang, Ling
    FRONTIERS IN PHYSIOLOGY, 2020, 11
  • [44] CNN-MHSA: A Convolutional Neural Network and multi-head self-attention combined approach for detecting phishing websites
    Xiao, Xi
    Zhang, Dianyan
    Hu, Guangwu
    Jiang, Yong
    Xia, Shutao
    NEURAL NETWORKS, 2020, 125 : 303 - 312
  • [46] A Multi-tab Webpage Fingerprinting Method Based on Multi-head Self-attention
    Xie, Lixia
    Li, Yange
    Yang, Hongyu
    Hu, Ze
    Wang, Peng
    Cheng, Xiang
    Zhang, Liang
    FRONTIERS IN CYBER SECURITY, FCS 2023, 2024, 1992 : 131 - 140
  • [47] DeepCAC: a deep learning approach on DNA transcription factors classification based on multi-head self-attention and concatenate convolutional neural network
    Zhang, Jidong
    Liu, Bo
    Wu, Jiahui
    Wang, Zhihan
    Li, Jianqiang
    BMC BIOINFORMATICS, 2023, 24 (01)
  • [48] DeepMHADTA: Prediction of Drug-Target Binding Affinity Using Multi-Head Self-Attention and Convolutional Neural Network
    Deng, Lei
    Zeng, Yunyun
    Liu, Hui
    Liu, Zixuan
    Liu, Xuejun
    CURRENT ISSUES IN MOLECULAR BIOLOGY, 2022, 44 (05) : 2287 - 2299
  • [49] A spatial-spectral fusion convolutional transformer network with contextual multi-head self-attention for hyperspectral image classification
    Wang, Wuli
    Sun, Qi
    Zhang, Li
    Ren, Peng
    Wang, Jianbu
    Ren, Guangbo
    Liu, Baodi
    NEURAL NETWORKS, 2025, 187
  • [50] Masked multi-head self-attention for causal speech enhancement
    Nicolson, Aaron
    Paliwal, Kuldip K.
    SPEECH COMMUNICATION, 2020, 125 : 80 - 96