Dense Residual LSTM-Attention Network for Boiler Steam Temperature Prediction with Uncertainty Analysis

Cited by: 9
Authors
Tong, Zheming [1 ,2 ]
Chen, Xin [1 ,2 ]
Tong, Shuiguang [1 ,2 ]
Yang, Qi [1 ,2 ]
Affiliations
[1] Zhejiang Univ, State Key Lab Fluid Power & Mechatron Syst, Hangzhou 310027, Peoples R China
[2] Zhejiang Univ, Sch Mech Engn, Hangzhou 310027, Peoples R China
Source
ACS OMEGA | 2022, Vol. 7, Issue 13
Funding
National Natural Science Foundation of China;
Keywords
NEURAL-NETWORK;
DOI
10.1021/acsomega.2c00615
CLC Classification
O6 [Chemistry];
Discipline Code
0703;
Abstract
Flexible operation of large-scale boilers for electricity generation is essential in modern power systems. Accurate prediction of boiler steam temperature is critical to the operational efficiency of boiler units and to preventing overtemperature. In this study, a dense residual long short-term memory (LSTM)-attention model is proposed for steam temperature prediction. In particular, the residual elements of the proposed model improve accuracy by adding short skip connections between layers. To provide more complete information for steam temperature prediction, uncertainty analysis based on the proposed model is performed to quantify the uncertainties in steam temperature variations. Our results demonstrate that the proposed method performs well in steam temperature prediction, with a mean absolute error (MAE) of less than 0.6 °C. Compared with support-vector regression (SVR), ridge regression (RIDGE), the recurrent neural network (RNN), the gated recurrent unit (GRU), and LSTM, the proposed model improves prediction accuracy by 32, 16, 12, 10, and 11% in terms of MAE, respectively. According to our analysis, the dense residual LSTM-attention model provides accurate early warning of overtemperature, enabling the development of real-time steam temperature control.
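The abstract describes the architecture only at a high level: stacked LSTM layers with short skip (residual) connections, followed by an attention mechanism and a prediction head. The sketch below illustrates one plausible reading of that description in Python (PyTorch). The layer sizes, number of residual blocks, the additive attention form, and all variable names are assumptions made for illustration; they are not taken from the paper, and the uncertainty analysis mentioned in the abstract is not shown here.

# Minimal sketch of a dense residual LSTM-attention regressor (illustrative only;
# hyperparameters and the attention form are assumptions, not the authors' design).
import torch
import torch.nn as nn

class ResidualLSTMBlock(nn.Module):
    # One LSTM layer wrapped with a short skip connection, as described in the abstract.
    def __init__(self, hidden_size):
        super().__init__()
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)

    def forward(self, x):
        out, _ = self.lstm(x)
        return out + x  # residual (short skip) connection

class DenseResidualLSTMAttention(nn.Module):
    # Stacked residual LSTM blocks, additive attention over time steps, and a dense output head.
    def __init__(self, n_features, hidden_size=64, n_blocks=3):
        super().__init__()
        self.input_proj = nn.Linear(n_features, hidden_size)
        self.blocks = nn.ModuleList([ResidualLSTMBlock(hidden_size) for _ in range(n_blocks)])
        self.attn_score = nn.Linear(hidden_size, 1)  # scores each time step
        self.head = nn.Linear(hidden_size, 1)        # predicted steam temperature

    def forward(self, x):                            # x: (batch, time, n_features)
        h = self.input_proj(x)
        for block in self.blocks:
            h = block(h)
        weights = torch.softmax(self.attn_score(h), dim=1)  # attention weights over time
        context = (weights * h).sum(dim=1)                  # weighted sum of hidden states
        return self.head(context).squeeze(-1)

# Example usage with random data: 16 windows of 30 time steps and 8 boiler signals.
model = DenseResidualLSTMAttention(n_features=8)
x = torch.randn(16, 30, 8)
y_hat = model(x)   # shape: (16,) predicted steam temperatures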
Pages: 11422-11429
Page count: 8