TCANet: a temporal convolutional attention network for motor imagery EEG decoding

Times Cited: 0
Authors
Zhao, Wei [1 ]
Lu, Haodong [1 ]
Zhang, Baocan [1 ]
Zheng, Xinwang [1 ]
Wang, Wenfeng [2 ,3 ]
Zhou, Haifeng [4 ]
Affiliations
[1] Jimei Univ, Chengyi Coll, Xiamen 361021, Peoples R China
[2] Shanghai Inst Technol, Shanghai 200235, Peoples R China
[3] Int Acad Visual Arts & Engn, London CR2 6EQ, England
[4] Jimei Univ, Sch Marine Engn, Xiamen 361021, Peoples R China
Keywords
Brain-computer interface (BCI); Deep learning (DL); Motor imagery (MI); Self-attention; Temporal convolutional network (TCN); NEURAL-NETWORK; TRANSFORMER; CLASSIFICATION; TIME;
DOI
10.1007/s11571-025-10275-5
Chinese Library Classification
Q189 [Neuroscience]
Discipline Code
071006
Abstract
Decoding motor imagery electroencephalogram (MI-EEG) signals is fundamental to the development of brain-computer interface (BCI) systems. However, robust decoding remains a challenge due to the inherent complexity and variability of MI-EEG signals. This study proposes the Temporal Convolutional Attention Network (TCANet), a novel end-to-end model that hierarchically captures spatiotemporal dependencies by progressively integrating local, fused, and global features. Specifically, TCANet employs a multi-scale convolutional module to extract local spatiotemporal representations across multiple temporal resolutions. A temporal convolutional module then fuses and compresses these multi-scale features while modeling both short- and long-term dependencies. Subsequently, a stacked multi-head self-attention mechanism refines the global representations, followed by a fully connected layer that performs MI-EEG classification. The proposed model was systematically evaluated on the BCI IV-2a and IV-2b datasets under both subject-dependent and subject-independent settings. In subject-dependent classification, TCANet achieved accuracies of 83.06% and 88.52% on BCI IV-2a and IV-2b, respectively, with corresponding Kappa values of 0.7742 and 0.7703, outperforming multiple representative baselines. In the more challenging subject-independent setting, TCANet achieved competitive performance on IV-2a and demonstrated potential for improvement on IV-2b. The code is available at https://github.com/snailpt/TCANet.
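The following is a minimal PyTorch sketch of the pipeline described in the abstract (multi-scale temporal convolution, a TCN-style temporal module, stacked multi-head self-attention, and a fully connected classifier). All layer widths, kernel lengths, pooling factors, and block counts below are illustrative assumptions and are not taken from the paper; the authors' actual implementation is in the linked repository (https://github.com/snailpt/TCANet).

```python
# Illustrative sketch only; hyperparameters are assumptions, not the authors' values.
import torch
import torch.nn as nn


class MultiScaleConv(nn.Module):
    """Extract local temporal features at several kernel scales, then mix channels."""
    def __init__(self, in_ch, out_ch, kernel_sizes=(15, 31, 63)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv1d(in_ch, out_ch, k, padding=k // 2) for k in kernel_sizes
        ])
        self.fuse = nn.Sequential(
            nn.Conv1d(out_ch * len(kernel_sizes), out_ch, 1),
            nn.BatchNorm1d(out_ch),
            nn.ELU(),
        )

    def forward(self, x):                        # x: (batch, eeg_channels, time)
        x = torch.cat([b(x) for b in self.branches], dim=1)
        return self.fuse(x)


class TemporalBlock(nn.Module):
    """Dilated causal convolution with a residual connection (TCN-style)."""
    def __init__(self, ch, kernel_size=4, dilation=1):
        super().__init__()
        pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(ch, ch, kernel_size, padding=pad, dilation=dilation)
        self.norm = nn.BatchNorm1d(ch)
        self.act = nn.ELU()

    def forward(self, x):
        out = self.conv(x)[..., : x.size(-1)]    # trim right padding -> causal output
        return self.act(self.norm(out) + x)


class TCANetSketch(nn.Module):
    def __init__(self, n_eeg_channels=22, n_classes=4, feat=32,
                 n_tcn_blocks=2, n_attn_layers=2, n_heads=4):
        super().__init__()
        self.local = MultiScaleConv(n_eeg_channels, feat)
        self.pool = nn.AvgPool1d(8)              # compress the temporal axis
        self.tcn = nn.Sequential(*[
            TemporalBlock(feat, dilation=2 ** i) for i in range(n_tcn_blocks)
        ])
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=feat, nhead=n_heads, dim_feedforward=4 * feat, batch_first=True)
        self.attn = nn.TransformerEncoder(encoder_layer, num_layers=n_attn_layers)
        self.classify = nn.Linear(feat, n_classes)

    def forward(self, x):                        # x: (batch, eeg_channels, time)
        x = self.pool(self.local(x))             # local multi-scale features
        x = self.tcn(x)                          # short- and long-term dependencies
        x = self.attn(x.transpose(1, 2))         # global refinement, (batch, time, feat)
        return self.classify(x.mean(dim=1))      # temporal average -> class logits


if __name__ == "__main__":
    model = TCANetSketch()
    logits = model(torch.randn(8, 22, 1000))     # BCI IV-2a-like input: 22 channels, 1000 samples
    print(logits.shape)                          # torch.Size([8, 4])
```

As a design note, the sketch keeps the three stages from the abstract separable: the multi-scale convolutions capture local patterns, the dilated TCN blocks fuse them over longer horizons, and the transformer encoder layers supply the stacked multi-head self-attention before classification.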
Pages: 18