BERT-LSTM network prediction model based on Transformer

Cited by: 0
Authors
Guo, Jiachen [1 ]
Liu, Jun [1 ]
Yang, Chenxi [1 ]
Dong, Jianguo [2 ]
Wang, Zhengyi [1 ]
Dong Shijian [3 ]
Affiliations
[1] China Univ Min & Technol, Xuzhou, Jiangsu, Peoples R China
[2] Southeast Univ, Sch Automat, Nanjing, Peoples R China
[3] Northeastern Univ, Sch Met, Shenyang, Peoples R China
Source
PROCEEDINGS OF THE 36TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC 2024
Keywords
Multivariate systems; time series prediction; Transformer; BERT; CNN-LSTM;
DOI
10.1109/CCDC62350.2024.10588173
CLC number
TP [automation technology, computer technology]
Subject classification code
0812
Abstract
A network model is designed to predict multivariable systems, using a Transformer-based BERT architecture for encoding and a long short-term memory (LSTM) architecture for decoding. The BERT-LSTM network model with a feature compensation block uses a sliding window to process the data. The BERT network serves as the encoder block, capturing long-distance positional features of the time series data. The feature compensation block consists of a CNN-LSTM model arranged in a binary-tree framework, which rectifies the BERT network's inability to capture global long-term dependencies between time series data. The decoding capability of the LSTM is used to predict the global long-term dependency features together with distance and position information, realizing the prediction function of the network. Finally, comparison with existing network models verifies the superiority of the proposed model.
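The sliding-window preprocessing mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `sliding_windows`, the one-step-ahead default horizon, and the array layout are all assumptions.

```python
import numpy as np

def sliding_windows(series, window, horizon=1):
    """Split a multivariate series of shape (T, d) into supervised pairs.

    Hypothetical helper illustrating sliding-window preprocessing:
    X has shape (N, window, d) -- the input windows fed to the encoder;
    y has shape (N, d) -- the value `horizon` steps after each window.
    """
    T = len(series)
    X, y = [], []
    for start in range(T - window - horizon + 1):
        X.append(series[start:start + window])          # input window
        y.append(series[start + window + horizon - 1])  # prediction target
    return np.stack(X), np.stack(y)

# Example: 10 time steps of a 3-variable system, window length 4
series = np.arange(30, dtype=float).reshape(10, 3)
X, y = sliding_windows(series, window=4)
print(X.shape, y.shape)  # (6, 4, 3) (6, 3)
```

Each window of past observations becomes one encoder input, and the value immediately following the window becomes the training target for the decoder.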
Pages: 3098-3103
Page count: 6