AttenEpilepsy: A 2D convolutional network model based on multi-head self-attention

Cited by: 0
Authors:
Ma, Shuang [1 ,2 ]
Wang, Haifeng [1 ,2 ]
Yu, Zhihao [1 ,2 ]
Du, Luyao [4 ]
Zhang, Ming [1 ,2 ]
Fu, Qingxi [2 ,3 ]
Affiliations:
[1] Linyi Univ, Sch Informat Sci & Engn, Linyi 276000, Shandong, Peoples R China
[2] Linyi Peoples Hosp, Hlth & Med Big Data Ctr, Linyi 276034, Shandong, Peoples R China
[3] Linyi City Peoples Hosp, Linyi Peoples Hosp Shandong Prov, Linyi 276034, Peoples R China
[4] Wuhan Univ Technol, Sch Automation, Wuhan, Peoples R China
Keywords:
Long-range dependence; Time-frequency image; Feature extraction; Time-frequency context encoding; Multi-head self-attention; Causal convolution; SLEEP STAGE CLASSIFICATION; LEARNING APPROACH;
DOI:
10.1016/j.enganabound.2024.105989
Chinese Library Classification (CLC):
T [Industrial technology]
Discipline code:
08
Abstract:
Existing epilepsy detection models emphasize local information over true long-range dependence when capturing time-frequency image features. This leads to imprecise feature vector extraction and leaves room for improving detection accuracy. AttenEpilepsy is a novel 2D convolutional network model that uses a multi-head self-attention mechanism to classify single-channel EEG signals into seizure, inter-seizure, and healthy states. The AttenEpilepsy model consists of two parts: feature extraction and time-frequency context encoding (STCE). A feature extraction method combining multi-path convolution and adaptive hybrid feature recalibration is proposed, in which multi-path convolution with kernels of different sizes extracts multi-scale features from the time-frequency images. STCE consists of two modules, multi-head self-attention and causal convolution: a modified multi-head self-attention mechanism models the extracted time-frequency features, and causal convolution analyses the time dependencies of the frequency information. A public dataset from the University of Bonn Epilepsy Research Center is used to evaluate the performance of the AttenEpilepsy model. The experimental results show that the AttenEpilepsy model achieves accuracy (AC), sensitivity (SE), specificity (SP), and F1 score (F1) of 99.81%, 99.82%, 99.89%, and 99.83%, respectively. Model robustness is further tested by introducing various types of noise into the input data. The proposed AttenEpilepsy network model outperforms state-of-the-art methods across these evaluation metrics.
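For orientation only, the sketch below illustrates the two stages named in the abstract: multi-path 2D convolution with different kernel sizes for multi-scale feature extraction from a time-frequency image, and an STCE-style block combining multi-head self-attention with causal convolution along the time axis. It is a minimal PyTorch sketch, not the authors' published configuration; all kernel sizes, channel counts, the residual/normalization layout, and the module names (MultiPathConv, CausalConv1d, STCE) are illustrative assumptions.

# Hypothetical sketch (PyTorch) of the two stages described in the abstract.
# Hyperparameters and module names are illustrative assumptions only.
import torch
import torch.nn as nn


class MultiPathConv(nn.Module):
    """Multi-path 2D convolution: parallel branches with different kernel
    sizes extract multi-scale features from a time-frequency image."""

    def __init__(self, in_ch=1, out_ch=32, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.paths = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, k, padding=k // 2),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
            for k in kernel_sizes
        ])

    def forward(self, x):                      # x: (B, 1, freq, time)
        # Concatenate the multi-scale feature maps along the channel axis.
        return torch.cat([p(x) for p in self.paths], dim=1)


class CausalConv1d(nn.Module):
    """Causal 1D convolution over the time axis: the output at step t
    depends only on steps <= t (left-pad, then convolve)."""

    def __init__(self, channels, kernel_size=3):
        super().__init__()
        self.pad = kernel_size - 1
        self.conv = nn.Conv1d(channels, channels, kernel_size)

    def forward(self, x):                      # x: (B, C, T)
        x = nn.functional.pad(x, (self.pad, 0))
        return self.conv(x)


class STCE(nn.Module):
    """Time-frequency context encoding: multi-head self-attention models
    long-range dependence across time steps, followed by causal convolution."""

    def __init__(self, d_model=128, n_heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        self.causal = CausalConv1d(d_model)

    def forward(self, x):                      # x: (B, T, d_model)
        ctx, _ = self.attn(x, x, x)
        x = self.norm(x + ctx)                 # residual + layer norm
        x = self.causal(x.transpose(1, 2)).transpose(1, 2)
        return x

The adaptive hybrid feature recalibration step and the exact wiring between the two stages are not specified in the abstract, so the modules above are shown independently rather than chained.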
Pages: 11
Related papers (50 total)
  • [21] Multi-Head Self-Attention Gated-Dilated Convolutional Neural Network for Word Sense Disambiguation
    Zhang, Chun-Xiang
    Zhang, Yu-Long
    Gao, Xue-Yao
    IEEE ACCESS, 2023, 11 : 14202 - 14210
  • [22] MHSAN: Multi-Head Self-Attention Network for Visual Semantic Embedding
    Park, Geondo
    Han, Chihye
    Kim, Daeshik
    Yoon, Wonjun
    2020 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2020, : 1507 - 1515
  • [23] Remaining Useful Life Prediction of Bearings Based on Multi-head Self-attention Mechanism, Multi-scale Temporal Convolutional Network and Convolutional Neural Network
    Wei, Hao
    Gu, Yu
    Zhang, Qinghua
    2023 35TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2023, : 3027 - 3032
  • [24] Multi-head attention graph convolutional network model: End-to-end entity and relation joint extraction based on multi-head attention graph convolutional network
    Tao, Zhihua
    Ouyang, Chunping
    Liu, Yongbin
    Chung, Tonglee
    Cao, Yixin
    CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY, 2023, 8 (02) : 468 - 477
  • [25] ViolenceNet: Dense Multi-Head Self-Attention with Bidirectional Convolutional LSTM for Detecting Violence
    Rendon-Segador, Fernando J.
    Alvarez-Garcia, Juan A.
    Enriquez, Fernando
    Deniz, Oscar
    ELECTRONICS, 2021, 10 (13)
  • [26] Arrhythmia classification algorithm based on multi-head self-attention mechanism
    Wang, Yue
    Yang, Guanci
    Li, Shaobo
    Li, Yang
    He, Ling
    Liu, Dan
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 79
  • [27] A novel two-stream multi-head self-attention convolutional neural network for bearing fault diagnosis
    Ren, Hang
    Liu, Shaogang
    Wei, Fengmei
    Qiu, Bo
    Zhao, Dan
    PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART C-JOURNAL OF MECHANICAL ENGINEERING SCIENCE, 2024, 238 (11) : 5393 - 5405
  • [28] Neural News Recommendation with Multi-Head Self-Attention
    Wu, Chuhan
    Wu, Fangzhao
    Ge, Suyu
    Qi, Tao
    Huang, Yongfeng
    Xie, Xing
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 6389 - 6394
  • [30] Multi-head self-attention based gated graph convolutional networks for aspect-based sentiment classification
    Xiao, Luwei
    Hu, Xiaohui
    Chen, Yinong
    Xue, Yun
    Chen, Bingliang
    Gu, Donghong
    Tang, Bixia
    MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (14) : 19051 - 19070