BERT-LSTM network prediction model based on Transformer

Cited by: 0
Authors
Guo, Jiachen [1 ]
Liu, Jun [1 ]
Yang, Chenxi [1 ]
Dong, Jianguo [2 ]
Wang, Zhengyi [1 ]
Dong Shijian [3 ]
机构
[1] China Univ Min & Technol, Xuzhou, Jiangsu, Peoples R China
[2] Southeast Univ, Sch Automat, Nanjing, Peoples R China
[3] Northeastern Univ, Sch Met, Shenyang, Peoples R China
Source
PROCEEDINGS OF THE 36TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC 2024 | 2024
Keywords
Multivariate systems; time series prediction; Transformer; BERT; CNN-LSTM;
DOI
10.1109/CCDC62350.2024.10588173
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
A Transformer-based network model that encodes with BERT and decodes with a long short-term memory (LSTM) architecture is designed to predict multivariable systems. The BERT-LSTM network model with a feature compensation block uses a sliding window to process the data. The BERT network serves as the encoder block to capture long-distance positional features of the time-series data. The feature compensation block is composed of CNN-LSTM models arranged in a binary-tree framework, which rectifies the BERT network's inability to capture the global long-term dependence between time-series data. The decoding characteristics of the LSTM are used to predict the global long-term dependence features together with their distance and position information, thereby realizing the prediction function of the network. Finally, the superiority of the proposed network model is verified by comparison with existing network models.
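The sketch below is a minimal, hypothetical PyTorch rendering of the architecture described in the abstract: a BERT-style Transformer encoder, a CNN-LSTM feature compensation branch, and an LSTM decoder operating on sliding-window inputs. The class name BertLstmPredictor, all hyperparameters, and the reduction of the binary-tree CNN-LSTM framework to a single CNN-LSTM unit are illustrative assumptions, not the authors' implementation.
```python
import torch
import torch.nn as nn

class BertLstmPredictor(nn.Module):
    """Sketch: BERT-style encoder + CNN-LSTM feature compensation + LSTM decoder."""
    def __init__(self, n_features, d_model=64, n_heads=4, n_layers=2, horizon=1):
        super().__init__()
        # Project the multivariate input into the encoder dimension.
        self.input_proj = nn.Linear(n_features, d_model)
        # BERT-style (Transformer encoder) block for long-distance positional features.
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        # Feature compensation branch: CNN followed by LSTM (the paper nests such
        # CNN-LSTM units in a binary-tree framework; one unit is shown here).
        self.comp_cnn = nn.Conv1d(n_features, d_model, kernel_size=3, padding=1)
        self.comp_lstm = nn.LSTM(d_model, d_model, batch_first=True)
        # LSTM decoder fuses both feature streams; a linear head emits the forecast.
        self.decoder = nn.LSTM(2 * d_model, d_model, batch_first=True)
        self.head = nn.Linear(d_model, horizon)

    def forward(self, x):
        # x: (batch, window_length, n_features) produced by a sliding window.
        enc = self.encoder(self.input_proj(x))                    # positional features
        comp = self.comp_cnn(x.transpose(1, 2)).transpose(1, 2)   # local patterns
        comp, _ = self.comp_lstm(comp)                            # long-term dependence
        fused, _ = self.decoder(torch.cat([enc, comp], dim=-1))   # decode fused features
        return self.head(fused[:, -1])                            # forecast per window

# Hypothetical usage: 10-variable system, 32-step window, one-step-ahead forecast.
model = BertLstmPredictor(n_features=10)
y_hat = model(torch.randn(8, 32, 10))  # -> shape (8, 1)
```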
Pages: 3098 - 3103
Page count: 6
Related Papers
50 items in total
  • [21] Rumor detection using BERT-based social circle and interaction network model
    Thirumoorthy, K.
    Britto, J. Jerold John
    Haripriya, P.
    Shreenee, N.
    SOCIAL NETWORK ANALYSIS AND MINING, 2024, 14 (01)
  • [22] BERT2OME: Prediction of 2'-O-Methylation Modifications From RNA Sequence by Transformer Architecture Based on BERT
    Soylu, Necla Nisa
    Sefer, Emre
    IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2023, 20 (03) : 2177 - 2189
  • [24] Retrieval of Injection Molding Industrial Knowledge Graph Based on Transformer and BERT
    Zhou, Zhe-Wei
    Jong, Wen-Ren
    Ting, Yu-Hung
    Chen, Shia-Chung
    Chiu, Ming-Chien
    APPLIED SCIENCES-BASEL, 2023, 13 (11):
  • [25] Multiscale convolutional neural-based transformer network for time series prediction
    Wang, Zhixing
    Guan, Yepeng
    SIGNAL IMAGE AND VIDEO PROCESSING, 2024, 18 (02) : 1015 - 1025
  • [26] MPformer: A Transformer-Based Model for Earthen Ruins Climate Prediction
    Xu, Guodong
    Wang, Hai
    Ji, Shuo
    Ma, Yuhui
    Feng, Yi
    TSINGHUA SCIENCE AND TECHNOLOGY, 2024, 29 (06): 1829 - 1838
  • [27] RoBERTa-LSTM: A Hybrid Model for Sentiment Analysis With Transformer and Recurrent Neural Network
    Tan, Kian Long
    Lee, Chin Poo
    Anbananthen, Kalaiarasi Sonai Muthu
    Lim, Kian Ming
    IEEE ACCESS, 2022, 10 : 21517 - 21525
  • [28] Transformer based Contextual Model for Sentiment Analysis of Customer Reviews: A Fine-tuned BERT A Sequence Learning BERT Model for Sentiment Analysis
    Durairaj, Ashok Kumar
    Chinnalagu, Anandan
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2021, 12 (11) : 474 - 480
  • [29] Rolling Bearing Remaining Useful Life Prediction Based on LSTM-Transformer Algorithm
    Tang, Xinglu
    Xi, Hui
    Chen, Qianqian
    Lin, Tian Ran
    PROCEEDINGS OF INCOME-VI AND TEPEN 2021: PERFORMANCE ENGINEERING AND MAINTENANCE ENGINEERING, 2023, 117 : 207 - 215
  • [30] LBKT: A LSTM BERT-Based Knowledge Tracing Model for Long-Sequence Data
    Li, Zhaoxing
    Yang, Jujie
    Wang, Jindi
    Shi, Lei
    Feng, Jiayi
    Stein, Sebastian
    GENERATIVE INTELLIGENCE AND INTELLIGENT TUTORING SYSTEMS, PT II, ITS 2024, 2024, 14799 : 174 - 184