Decoding text from electroencephalography signals: A novel Hierarchical Gated Recurrent Unit with Masked Residual Attention Mechanism

Cited: 0
Authors
Chen, Qiupu [1 ,2 ]
Wang, Yimou [1 ,4 ]
Wang, Fenmei [1 ,2 ,3 ]
Sun, Duolin [1 ]
Li, Qiankun [4 ]
Affiliations
[1] Univ Sci & Technol China, Sci Isl Branch, Grad Sch, Hefei, Peoples R China
[2] Chinese Acad Sci, Inst Intelligent Machines, Hefei Inst Phys Sci, Hefei, Anhui, Peoples R China
[3] PLA Army Acad Artillery & Air Def, Hefei 230031, Peoples R China
[4] Univ Sci & Technol China, Dept Automat, Hefei, Peoples R China
Keywords
Brain to text; Brain decoding; Brain-machine interface; Neurolinguistics; Deep learning;
DOI
10.1016/j.engappai.2024.109615
CLC Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Progress in both neuroscience and natural language processing has opened doors for investigating brain-to-text techniques to reconstruct what individuals see, perceive, or focus on from human brain activity patterns. Non-invasive decoding, utilizing electroencephalography (EEG) signals, is preferred due to its comfort, cost-effectiveness, and portability. In brain-to-text applications, a pressing need has arisen to develop effective models that can accurately capture the intricate details of EEG signals, such as global and local contextual information and long-term dependencies. In response to this need, we propose the Hierarchical Gated Recurrent Unit with Masked Residual Attention Mechanism (HGRU-MRAM) model, which ingeniously combines the hierarchical structure and the masked residual attention mechanism to deliver a robust brain-to-text decoding system. Our experimental results on the ZuCo dataset demonstrate that this model significantly outperforms existing baselines, achieving state-of-the-art performance with Bilingual Evaluation Understudy (BLEU), Recall-Oriented Understudy for Gisting Evaluation (ROUGE), US National Institute of Standards and Technology Metric (NIST), Metric for Evaluation of Translation with Explicit Ordering (METEOR), Translation Edit Rate (TER), and Bilingual Evaluation Understudy with Representations from Transformers (BLEURT) scores of 48.29, 34.84, 4.07, 34.57, 21.98, and 40.45, respectively. The code is available at https://github.com/qpuchen/EEG-To-Sentence.
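The headline metric in the abstract is BLEU, which scores a decoded sentence by its clipped n-gram overlap with the reference, discounted by a brevity penalty. As a rough illustration of what that figure measures (not the paper's actual evaluation pipeline, which presumably uses a standard toolkit such as NLTK or sacreBLEU), a minimal single-reference, uniform-weight sentence-level BLEU can be sketched in pure Python:

```python
import math
from collections import Counter


def ngram_precision(reference, hypothesis, n):
    """Modified n-gram precision: hypothesis n-gram counts are
    clipped by their counts in the reference."""
    ref_counts = Counter(tuple(reference[i:i + n])
                         for i in range(len(reference) - n + 1))
    hyp_ngrams = [tuple(hypothesis[i:i + n])
                  for i in range(len(hypothesis) - n + 1)]
    hyp_counts = Counter(hyp_ngrams)
    clipped = sum(min(count, ref_counts[gram])
                  for gram, count in hyp_counts.items())
    return clipped / max(len(hyp_ngrams), 1)


def bleu(reference, hypothesis, max_n=4):
    """Sentence-level BLEU against a single reference: geometric mean
    of 1..max_n n-gram precisions times a brevity penalty."""
    precisions = [ngram_precision(reference, hypothesis, n)
                  for n in range(1, max_n + 1)]
    if min(precisions) == 0:  # any zero precision zeroes the geometric mean
        return 0.0
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty: penalize hypotheses shorter than the reference.
    if len(hypothesis) > len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(hypothesis), 1))
    return bp * math.exp(log_avg)
```

A perfect decode scores 1.0 (reported as 100 on the percentage scale used above), a hypothesis sharing no words with the reference scores 0.0, and a correct but truncated decode is discounted by the brevity penalty.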
Pages: 11
Related Papers
50 records in total
  • [1] Hierarchical Gated Recurrent Unit with Semantic Attention for Event Prediction
    Su, Zichun
    Jiang, Jialin
    FUTURE INTERNET, 2020, 12 (02)
  • [2] HAN-ReGRU: hierarchical attention network with residual gated recurrent unit for emotion recognition in conversation
    Ma, Hui
    Wang, Jian
    Qian, Lingfei
    Lin, Hongfei
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (07): 2685-2703
  • [3] A model for electroencephalogram emotion recognition: Residual block-gated recurrent unit with attention mechanism
    Wang, Yujie
    Zhang, Xiu
    Zhang, Xin
    Sun, Baiwei
    Xu, Bingyue
    REVIEW OF SCIENTIFIC INSTRUMENTS, 2024, 95 (08)
  • [4] A novel tool wear monitoring approach based on attention mechanism and gated recurrent unit
    Zhang, Lei
    Zhao, Zhengcai
    Zheng, Shichen
    Qian, Ning
    Li, Yao
    Xu, Jiuhua
    Huan, Haixiang
    MACHINING SCIENCE AND TECHNOLOGY, 2025, 29 (02): 187-211
  • [5] Residual stacked gated recurrent unit with encoder-decoder architecture and an attention mechanism for temporal traffic prediction
    Kuo, R. J.
    Kunarsito, D. A.
    SOFT COMPUTING, 2022, 26 (17): 8617-8633
  • [6] A Novel Residual Gated Recurrent Unit Framework for Runoff Forecasting
    Sheng, Ziyu
    Wen, Shiping
    Feng, Zhong-Kai
    Shi, Kaibo
    Huang, Tingwen
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (14): 12736-12748
  • [7] A multi-energy loads forecasting model based on dual attention mechanism and multi-scale hierarchical residual network with gated recurrent unit
    Chen, Wenhao
    Rong, Fei
    Lin, Chuan
    ENERGY, 2025, 320
  • [8] Connected vehicle following control based on gated recurrent unit with attention mechanism
    Wang, Shengjie
    Pan, Deng
    Chen, Xianda
    Duan, Zexin
    Xu, Zehao
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2025, 142