Relation-Aware Attentive Neural Processes Model for Remaining Useful Life Prediction

Cited by: 1
Authors
Lu, Junzhong [1 ]
Cheng, Changming [1 ]
Zhao, Baoxuan [1 ]
Peng, Zhike [1 ]
Affiliations
[1] Shanghai Jiao Tong Univ, State Key Lab Mech Syst & Vibrat, Shanghai 200240, Peoples R China
Funding
National Natural Science Foundation of China; Natural Science Foundation of Shanghai;
Keywords
Predictive models; Computational modeling; Data models; Load modeling; Feature extraction; Task analysis; Context modeling; Attention mechanism; attentive neural processes (ANPs); deep learning; machine prognostic; remaining useful life (RUL);
DOI
10.1109/TIM.2022.3204089
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
For the remaining useful life (RUL) prediction task, the temporal information underlying the acquired data is crucial for prediction accuracy. Typical deep learning models for RUL prediction are built on the recurrent neural network (RNN) or on the convolutional neural network (CNN) with a time window. However, the CNN model cannot explicitly extract the global temporal information of the sequence, and the RNN model suffers from slow forward computation due to its sequential structure. This article proposes the relation-aware attentive neural processes (R-ANPs) model to solve the RUL prediction problem. A local relation-aware self-attention model first processes the input time series, explicitly fusing the local temporal information between adjacent sequence points. The RUL is then obtained by feeding the processed data into the attentive neural processes (ANPs) model. The relation-aware self-attention model and the ANPs model can be trained and inferred in parallel, which accelerates training. The proposed R-ANPs model has three innovative advantages: 1) the relation-aware self-attention model explicitly fuses local temporal information into each data point; 2) the model output contains a standard deviation, which provides an uncertainty measure for the prediction results; and 3) the training data serve as context for the ANPs model, so their valuable information is exploited at the prediction stage. The effectiveness of the proposed model is validated on a run-to-failure dataset. The results demonstrate that the proposed model outperforms recent RNN- and CNN-based models.
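The two ingredients the abstract highlights can be sketched in a few lines of NumPy: a relation-aware self-attention step that injects clipped relative-position (i.e., local temporal) information into the attention scores, in the style of Shaw et al.'s relative position encodings, and a Gaussian output head that emits a mean together with a positive standard deviation for uncertainty quantification. This is a minimal illustrative sketch, not the authors' exact formulation; the function names, the `clip` window, and the `rel_k` embedding table are assumptions introduced here.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def relation_aware_attention(x, wq, wk, wv, rel_k, clip=2):
    """Single-head self-attention with clipped relative-position keys.

    x: (T, d) input sequence; rel_k: (2*clip+1, d) learned embeddings,
    one per relative offset in [-clip, clip] (local temporal relations).
    """
    T, d = x.shape
    q, k, v = x @ wq, x @ wk, x @ wv
    # Relative offset of every key position to every query, clipped to the window.
    idx = np.clip(np.arange(T)[None, :] - np.arange(T)[:, None], -clip, clip) + clip
    r = rel_k[idx]                                   # (T, T, d) relative keys
    # Content term q·k plus relation term q·r, as in relation-aware attention.
    scores = (q @ k.T + np.einsum('td,tsd->ts', q, r)) / np.sqrt(d)
    return softmax(scores) @ v                       # (T, d)

def gaussian_head(h, w_mu, w_sigma):
    """Map features to a predictive mean and a positive std (softplus)."""
    mu = h @ w_mu
    sigma = np.log1p(np.exp(h @ w_sigma)) + 1e-6     # softplus keeps std > 0
    return mu, sigma
```

Because the relation term only depends on clipped offsets, each data point is enriched with its local temporal neighborhood, while all positions are still processed in parallel rather than sequentially as in an RNN.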
Pages: 9