BERT-LSTM network prediction model based on Transformer

Cited: 0
Authors
Guo, Jiachen [1 ]
Liu, Jun [1 ]
Yang, Chenxi [1 ]
Dong, Jianguo [2 ]
Wang, Zhengyi [1 ]
Dong Shijian [3 ]
Affiliations
[1] China Univ Min & Technol, Xuzhou, Jiangsu, Peoples R China
[2] Southeast Univ, Sch Automat, Nanjing, Peoples R China
[3] Northeastern Univ, Sch Met, Shenyang, Peoples R China
Source
PROCEEDINGS OF THE 36TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC 2024 | 2024
Keywords
Multivariate systems; time series prediction; Transformer; BERT; CNN-LSTM;
DOI
10.1109/CCDC62350.2024.10588173
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812
Abstract
A Transformer-based network model that encodes with BERT and decodes with a long short-term memory (LSTM) architecture is designed to predict multivariable systems. The BERT-LSTM network model with a feature compensation block uses a sliding window to process the data. The BERT network serves as the encoder block, capturing long-distance positional features of the time series data. The feature compensation block is composed of a CNN-LSTM model in a binary-tree framework, which rectifies the BERT network's inability to capture the global long-term dependence between time series data. The decoding characteristics of the LSTM are used to predict the global long-term dependence features together with their distance and position information, realizing the prediction function of the network. Finally, comparison with existing network models verifies the superiority of the proposed model.
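The abstract describes a sliding-window pipeline feeding a BERT-style Transformer encoder, a CNN-LSTM feature compensation branch, and an LSTM decoder. The PyTorch sketch below illustrates one plausible arrangement of these pieces under stated assumptions: the sliding_windows helper, all layer sizes, the flat CNN-LSTM stand-in for the paper's binary-tree block, and the concatenation-based feature fusion are illustrative choices, not the authors' implementation.

import torch
import torch.nn as nn


def sliding_windows(series, window, horizon=1):
    # Split a (T, num_vars) series into overlapping (window, num_vars)
    # inputs and (horizon, num_vars) targets. Hypothetical helper; the
    # paper's exact windowing scheme is not given in the abstract.
    xs, ys = [], []
    for t in range(series.size(0) - window - horizon + 1):
        xs.append(series[t : t + window])
        ys.append(series[t + window : t + window + horizon])
    return torch.stack(xs), torch.stack(ys)


class BertLstmPredictor(nn.Module):
    # Encoder: BERT-style Transformer encoder (long-distance positional
    # features). Compensation: a flat CNN-LSTM branch standing in for the
    # paper's binary-tree CNN-LSTM block. Decoder: LSTM over the fused
    # features, with a linear head for a one-step forecast.
    def __init__(self, num_vars, window, d_model=64, nhead=4, layers=2):
        super().__init__()
        self.embed = nn.Linear(num_vars, d_model)
        # Learned positional embedding so the encoder sees step order,
        # mirroring BERT's position embeddings (assumed detail).
        self.pos = nn.Parameter(torch.zeros(1, window, d_model))
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
        self.conv = nn.Conv1d(num_vars, d_model, kernel_size=3, padding=1)
        self.comp_lstm = nn.LSTM(d_model, d_model, batch_first=True)
        self.decoder = nn.LSTM(2 * d_model, d_model, batch_first=True)
        self.head = nn.Linear(d_model, num_vars)

    def forward(self, x):
        # x: (batch, window, num_vars)
        enc = self.encoder(self.embed(x) + self.pos[:, : x.size(1)])
        comp = self.conv(x.transpose(1, 2)).transpose(1, 2)  # local patterns
        comp, _ = self.comp_lstm(comp)                       # longer-range dependence
        fused, _ = self.decoder(torch.cat([enc, comp], dim=-1))
        return self.head(fused[:, -1, :])  # next-step forecast, (B, num_vars)


if __name__ == "__main__":
    series = torch.randn(200, 5)               # 200 steps, 5 variables
    x, y = sliding_windows(series, window=24)
    model = BertLstmPredictor(num_vars=5, window=24)
    print(model(x[:8]).shape)                  # torch.Size([8, 5])

Concatenating the encoder and compensation features before the decoder LSTM is one plausible way to let the decoder predict the global long-term dependence features with distance and position information; the paper's exact fusion scheme may differ.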
Pages: 3098-3103
Page count: 6