BERT-LSTM network prediction model based on Transformer

Cited by: 0
Authors
Guo, Jiachen [1 ]
Liu, Jun [1 ]
Yang, Chenxi [1 ]
Dong, Jianguo [2 ]
Wang, Zhengyi [1 ]
Dong Shijian [3 ]
Affiliations
[1] China Univ Min & Technol, Xuzhou, Jiangsu, Peoples R China
[2] Southeast Univ, Sch Automat, Nanjing, Peoples R China
[3] Northeastern Univ, Sch Met, Shenyang, Peoples R China
Source
PROCEEDINGS OF THE 36TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC 2024 | 2024
Keywords
Multivariate systems; time series prediction; Transformer; BERT; CNN-LSTM;
DOI
10.1109/CCDC62350.2024.10588173
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
A Transformer-based network model that encodes with BERT and decodes with a long short-term memory (LSTM) architecture is designed to predict multivariable systems. The BERT-LSTM network model, equipped with a feature compensation block, processes the data with a sliding window. The BERT network serves as the encoder block and captures long-distance positional features of the time series data. The feature compensation block consists of CNN-LSTM models arranged in a binary tree framework, which compensates for the BERT network's inability to capture global long-term dependencies in the time series data. The LSTM decoder then predicts from the global long-term dependency features, enriched with distance and position information, to realize the prediction function of the network. Finally, comparison with existing network models verifies the superiority of the proposed model.
Pages: 3098-3103 (6 pages)
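
The encode-compensate-decode flow described in the abstract can be illustrated with a short sketch. Below is a minimal PyTorch sketch, assuming a sliding window of multivariate observations, nn.TransformerEncoder as a stand-in for the BERT encoder block, a single-level binary split of the window for the CNN-LSTM feature compensation block, and an LSTM decoder with a linear prediction head. The names (sliding_windows, CNNLSTMBranch, BERTLSTM), layer sizes, and forecasting horizon are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch only: BERT-style encoder + CNN-LSTM feature compensation + LSTM decoder,
# following the high-level description in the abstract; sizes are assumptions.
import torch
import torch.nn as nn


def sliding_windows(series, window, step=1):
    """Slice a (time, features) series into (num_windows, window, features)."""
    return series.unfold(0, window, step).permute(0, 2, 1)


class CNNLSTMBranch(nn.Module):
    """One leaf of the feature compensation block: Conv1d over time, then LSTM."""
    def __init__(self, n_features, hidden):
        super().__init__()
        self.conv = nn.Conv1d(n_features, hidden, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)

    def forward(self, x):                      # x: (batch, window, n_features)
        h = self.conv(x.transpose(1, 2))       # (batch, hidden, window)
        h, _ = self.lstm(h.transpose(1, 2))    # (batch, window, hidden)
        return h


class BERTLSTM(nn.Module):
    def __init__(self, n_features, d_model=64, horizon=1):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        # BERT-style bidirectional Transformer encoder (long-distance features)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Feature compensation: CNN-LSTM leaves over the two halves of the
        # window (a one-level binary split), merged by a linear layer.
        self.left = CNNLSTMBranch(n_features, d_model)
        self.right = CNNLSTMBranch(n_features, d_model)
        self.merge = nn.Linear(2 * d_model, d_model)
        # LSTM decoder over concatenated encoder + compensation features.
        self.decoder = nn.LSTM(2 * d_model, d_model, batch_first=True)
        self.head = nn.Linear(d_model, n_features * horizon)
        self.horizon, self.n_features = horizon, n_features

    def forward(self, x):                      # x: (batch, window, n_features)
        enc = self.encoder(self.embed(x))      # (batch, window, d_model)
        half = x.size(1) // 2
        comp = self.merge(torch.cat(
            [self.left(x[:, :half]).mean(dim=1),    # summarize each half
             self.right(x[:, half:]).mean(dim=1)], dim=-1))
        comp = comp.unsqueeze(1).expand(-1, enc.size(1), -1)
        dec, _ = self.decoder(torch.cat([enc, comp], dim=-1))
        out = self.head(dec[:, -1])            # predict from the last step
        return out.view(-1, self.horizon, self.n_features)


if __name__ == "__main__":
    # Example: a 5-variable series of 500 steps, window of 32, one-step-ahead.
    series = torch.randn(500, 5)
    windows = sliding_windows(series, window=32)   # (469, 32, 5)
    model = BERTLSTM(n_features=5)
    pred = model(windows[:8])                      # (8, 1, 5)
    print(pred.shape)
```

The binary split here is kept one level deep for brevity; the paper's binary tree framework presumably stacks further levels of CNN-LSTM leaves, but the overall encode, compensate, decode structure is the same.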