CHP Engine Anomaly Detection Based on Parallel CNN-LSTM with Residual Blocks and Attention

Cited by: 5
Authors
Chung, Won Hee [1 ]
Gu, Yeong Hyeon [1 ]
Yoo, Seong Joon [2 ]
Affiliations
[1] Sejong Univ, Artificial Intelligence Dept, Seoul 05006, South Korea
[2] Sejong Univ, Comp Sci & Engn Dept, Seoul 05006, South Korea
Keywords
engine anomaly detection; convolutional neural network; long short-term memory; residual block; attention mechanism; Bayesian optimization; FAULT-DIAGNOSIS; SCADA DATA; MODEL; POWER; SYSTEM;
DOI
10.3390/s23218746
Chinese Library Classification
O65 [Analytical Chemistry];
Discipline Codes
070302; 081704;
Abstract
The extreme operating environment of the combined heat and power (CHP) engine is likely to cause anomalies and defects, which can lead to engine failure; thus, detecting engine anomalies is essential. In this study, we propose a parallel convolutional neural network-long short-term memory (CNN-LSTM) residual blocks attention (PCLRA) anomaly detection model that uses engine sensor data. To our knowledge, this is the first time that parallel CNN-LSTM-based networks have been used in the field of CHP engine anomaly detection. In PCLRA, spatiotemporal features are extracted via CNN-LSTM in parallel, and the information loss is compensated using the residual blocks and attention mechanism. The performance of PCLRA is compared with that of various hybrid models across 15 cases. First, the performances of serial and parallel models are compared. In addition, we evaluate the contributions of the residual blocks and attention mechanism to the performance of the CNN-LSTM hybrid model. The results indicate that PCLRA achieves the best performance, with a macro F1-score (mean ± standard deviation) of 0.951 ± 0.033, an anomaly F1-score of 0.903 ± 0.064, and an accuracy of 0.999 ± 0.002. We expect that the energy efficiency and safety of CHP engines can be improved by applying the PCLRA anomaly detection model.
Pages: 22
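As a reading aid, the sketch below illustrates the kind of architecture the abstract describes: a CNN branch with residual blocks and an LSTM branch run in parallel over the same sensor window, with an attention layer weighting the fused features before classification. This is not the authors' implementation; it is a minimal PyTorch sketch under stated assumptions. The names (PCLRASketch, ResidualConvBlock), layer sizes, the additive attention over fused time steps, and the input shape are all illustrative choices, and the paper tunes its actual hyperparameters with Bayesian optimization.

```python
# Hypothetical sketch (not the paper's code): parallel CNN-LSTM with residual
# blocks and attention for multivariate engine-sensor windows.
import torch
import torch.nn as nn

class ResidualConvBlock(nn.Module):
    """1-D convolutional block with a skip connection (residual block)."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        self.conv1 = nn.Conv1d(channels, channels, kernel_size, padding=kernel_size // 2)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size, padding=kernel_size // 2)
        self.relu = nn.ReLU()

    def forward(self, x):                       # x: (batch, channels, time)
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        return self.relu(out + x)               # skip connection limits information loss

class PCLRASketch(nn.Module):
    """Parallel CNN and LSTM branches -> attention over fused features -> classifier."""
    def __init__(self, n_sensors, n_classes, hidden=64):
        super().__init__()
        # CNN branch: local (spatial) patterns across the sensor channels
        self.cnn_in = nn.Conv1d(n_sensors, hidden, kernel_size=3, padding=1)
        self.res_blocks = nn.Sequential(ResidualConvBlock(hidden), ResidualConvBlock(hidden))
        # LSTM branch: temporal dependencies along the window
        self.lstm = nn.LSTM(n_sensors, hidden, batch_first=True)
        # Simple additive attention over the fused time steps (an assumption)
        self.att_score = nn.Linear(2 * hidden, 1)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                       # x: (batch, time, n_sensors)
        c = self.res_blocks(self.cnn_in(x.transpose(1, 2)))   # (batch, hidden, time)
        c = c.transpose(1, 2)                   # back to (batch, time, hidden)
        l, _ = self.lstm(x)                     # (batch, time, hidden)
        fused = torch.cat([c, l], dim=-1)       # parallel fusion: (batch, time, 2*hidden)
        weights = torch.softmax(self.att_score(fused), dim=1) # attention weights over time
        context = (weights * fused).sum(dim=1)  # weighted sum -> (batch, 2*hidden)
        return self.classifier(context)         # normal / anomaly logits

# Usage example with made-up dimensions: 8 windows of 60 time steps x 20 sensors.
model = PCLRASketch(n_sensors=20, n_classes=2)
logits = model(torch.randn(8, 60, 20))
```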