Dense Residual LSTM-Attention Network for Boiler Steam Temperature Prediction with Uncertainty Analysis

Cited: 9
Authors:
Tong, Zheming [1 ,2 ]
Chen, Xin [1 ,2 ]
Tong, Shuiguang [1 ,2 ]
Yang, Qi [1 ,2 ]
Affiliations:
[1] Zhejiang Univ, State Key Lab Fluid Power & Mechatron Syst, Hangzhou 310027, Peoples R China
[2] Zhejiang Univ, Sch Mech Engn, Hangzhou 310027, Peoples R China
Source:
ACS OMEGA | 2022, Vol. 7, Issue 13
Funding:
National Natural Science Foundation of China;
Keywords:
NEURAL-NETWORK;
DOI:
10.1021/acsomega.2c00615
Chinese Library Classification: O6 [Chemistry];
Subject Classification Code: 0703;
Abstract:
Flexible operation of large-scale boilers for electricity generation is essential in modern power systems. Accurate prediction of boiler steam temperature is of great importance to the operational efficiency of boiler units and to preventing overtemperature events. In this study, a dense residual long short-term memory (LSTM)-attention model is proposed for steam temperature prediction. In particular, the residual elements of the proposed model improve accuracy by adding short skip connections between layers. To provide more complete information for steam temperature prediction, an uncertainty analysis based on the proposed model is performed to quantify the uncertainties in steam temperature variations. Our results demonstrate that the proposed method performs well in steam temperature prediction, with a mean absolute error (MAE) below 0.6 °C. Compared with support-vector regression (SVR), ridge regression (RIDGE), the recurrent neural network (RNN), the gated recurrent unit (GRU), and LSTM, the proposed model reduces the MAE by 32, 16, 12, 10, and 11%, respectively. According to our analysis, the dense residual LSTM-attention model provides an accurate early warning of overtemperature, enabling the development of real-time steam temperature control.
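To illustrate the architecture the abstract describes, the following is a minimal sketch, assuming PyTorch; it is not the authors' published code. The class name DenseResidualLSTMAttention, the layer sizes, and the additive-attention form are illustrative assumptions, and the dense connectivity of the full model is simplified here to layer-to-layer residual sums.

```python
# A minimal sketch, assuming PyTorch; NOT the authors' published code.
# All names and sizes below are illustrative assumptions; the model's dense
# connectivity is simplified to layer-to-layer residual (short skip) sums.
import torch
import torch.nn as nn

class DenseResidualLSTMAttention(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64, n_layers: int = 3):
        super().__init__()
        # Stack of LSTM layers; the first maps the inputs to the hidden size.
        self.layers = nn.ModuleList(
            nn.LSTM(n_features if i == 0 else hidden, hidden, batch_first=True)
            for i in range(n_layers)
        )
        self.attn = nn.Linear(hidden, 1)  # scores each time step (additive attention)
        self.head = nn.Linear(hidden, 1)  # scalar steam-temperature prediction

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_features) window of boiler sensor readings
        h = x
        for i, lstm in enumerate(self.layers):
            out, _ = lstm(h)
            # Short skip connection between layers (shapes match after layer 0).
            h = out if i == 0 else h + out
        w = torch.softmax(self.attn(h).squeeze(-1), dim=1)  # (batch, time) weights
        ctx = (w.unsqueeze(-1) * h).sum(dim=1)              # attention-weighted context
        return self.head(ctx)                               # (batch, 1)

model = DenseResidualLSTMAttention(n_features=8)
y_hat = model(torch.randn(4, 30, 8))  # 4 windows of 30 time steps -> (4, 1)
```

In this sketch, the skip connection `h + out` lets each LSTM layer learn a correction to its input rather than a full transformation, which is the mechanism the abstract credits for the accuracy gain; the paper's uncertainty analysis is not reproduced here, though a common stand-in for such a sketch would be Monte Carlo dropout.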
Pages: 11422-11429 (8 pages)
Related Papers (50 in total):
• [31] Multi-Dimensional Residual Dense Attention Network for Stereo Matching. Zhang, Guanghui; Zhu, Dongchen; Shi, Wenjun; Ye, Xiaoqing; Li, Jiamao; Zhang, Xiaolin. IEEE ACCESS, 2019, 7: 51681-51690.
• [32] Image Dehazing Network Based on Residual Dense Block and Attention Mechanism. Li, S.; Liu, H.; Gan, Y.; Zhu, X.; Zhang, J. Hunan Daxue Xuebao/Journal of Hunan University Natural Sciences, 2021, 48 (06): 112-118.
• [33] Attention-Based Residual Dense Shrinkage Network for ECG Denoising. Zhang, Dengyong; Yuan, Minzhi; Li, Feng; Zhang, Lebing; Sun, Yanqiang; Ling, Yiming. CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2024, 138 (03): 2809-2824.
• [34] Airport delay prediction model based on regional residual and LSTM network. Qu, J.; Ye, M.; Qu, X. Tongxin Xuebao/Journal on Communications, 2019, 40 (04): 149-159.
• [35] A Stacked Residual LSTM Network for Nonlinear Seismic Response Prediction of Bridges. Liao, Y.-C.; Zhang, R.-Y.; Lin, R.; Zong, Z.-H.; Wu, G. Gongcheng Lixue/Engineering Mechanics, 2024, 41 (04): 47-58.
• [36] Vehicle Trajectory Prediction Using Hierarchical LSTM and Graph Attention Network. Wang, Jiaqin; Liu, Kai; Li, Hantao; Gao, Qiang; Wang, Xiangfen. IEEE INTERNET OF THINGS JOURNAL, 2025, 12 (06): 7010-7025.
• [37] Dual Memory LSTM with Dual Attention Neural Network for Spatiotemporal Prediction. Li, Teng; Guan, Yepeng. SENSORS, 2021, 21 (12).
• [38] LSTM-based graph attention network for vehicle trajectory prediction. Wang, Jiaqin; Liu, Kai; Li, Hantao. COMPUTER NETWORKS, 2024, 248.
• [39] Groundwater Level Prediction and Earthquake Precursor Anomaly Analysis Based on TCN-LSTM-Attention Network. Chen, Xinfang; Yang, Lijia; Liao, Xin; Zhao, Hanqing; Wang, Shiwei. IEEE ACCESS, 2024, 12: 176696-176718.
• [40] Improved dense residual network with the coordinate and pixel attention mechanisms for helmet detection. Mi, Jiang; Luo, Jingrui; Zhao, Haixia; Huang, Xingguo. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (11): 5015-5031.