DLformer: A Dynamic Length Transformer-Based Network for Efficient Feature Representation in Remaining Useful Life Prediction

Cited by: 25
Authors
Ren, Lei [1 ,2 ]
Wang, Haiteng [1 ]
Huang, Gao [3 ]
Affiliations
[1] Beihang Univ, Sch Automat Sci & Elect Engn, Beijing 100191, Peoples R China
[2] Zhongguancun Lab, Beijing 100094, Peoples R China
[3] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
Funding
U.S. National Science Foundation
Keywords
Transformers; Feature extraction; Maintenance engineering; Time series analysis; Computational modeling; Adaptation models; Task analysis; Adaptive inference; deep learning; feature representation; interpretability; remaining useful life (RUL) prediction; PROGNOSTICS;
DOI
10.1109/TNNLS.2023.3257038
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Representation learning-based remaining useful life (RUL) prediction plays a crucial role in improving the security and reducing the maintenance cost of complex systems. Despite their superior performance, the high computational cost of deep networks hinders deploying such models on low-compute platforms. A significant source of this cost is the computation required to represent long sequences. In contrast to most RUL prediction methods, which learn features over a fixed sequence length, we consider that each time series has its own characteristics and that the sequence length should be adjusted adaptively. Our motivation is that an "easy" sample with representative characteristics can be correctly predicted even when only a short feature representation is provided, while "hard" samples need the complete feature representation. Therefore, we focus on sequence length and propose a dynamic length transformer (DLformer) that can adaptively learn sequence representations of different lengths. Then, a feature reuse mechanism is developed to exploit previously learned features and reduce redundant computation. Finally, to achieve dynamic feature representation, a dedicated confidence strategy is designed to calculate the confidence level of the prediction results. Regarding interpretability, the dynamic architecture helps humans understand which part of the model is activated. Experiments on multiple datasets show that DLformer can increase inference speed by up to 90%, with less than 5% degradation in model accuracy.
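The abstract describes three ingredients: sub-networks that see progressively longer input windows, reuse of features computed by earlier stages, and a confidence score that triggers an early exit once a prediction is trustworthy. The paper's own architecture is not reproduced here; the following is a minimal NumPy sketch of that control flow under stated assumptions, where `sub_models`, `confidence`, and the toy linear stages are all hypothetical stand-ins for the learned components.

```python
import numpy as np

def dynamic_length_inference(x, sub_models, confidence, threshold=0.9):
    """Confidence-gated dynamic-length inference (illustrative sketch).

    x          : full input time series, shape (T, d)
    sub_models : list of callables; sub_models[k](window, reused) -> (pred, features),
                 where stage k sees a progressively longer suffix of x
    confidence : callable (pred, features) -> score in [0, 1]
    threshold  : exit as soon as the confidence score reaches this level
    """
    T = x.shape[0]
    n_stages = len(sub_models)
    reused = None  # features carried forward between stages (feature reuse)
    for k, model in enumerate(sub_models):
        length = T * (k + 1) // n_stages          # grow the visible window
        pred, reused = model(x[-length:], reused)  # reuse earlier features
        if confidence(pred, reused) >= threshold or k == n_stages - 1:
            return pred, k  # early exit: prediction plus the stage that produced it

# Toy demonstration with linear "sub-models" (purely illustrative)
def make_stage(scale):
    def stage(window, reused):
        feat = window.mean(axis=0) * scale
        if reused is not None:
            feat = feat + reused  # fold in features from earlier stages
        return float(feat.sum()), feat
    return stage

x = np.ones((12, 3))
stages = [make_stage(s) for s in (0.5, 1.0, 2.0)]
# In this toy setup, confidence simply rises with the prediction's magnitude
conf = lambda pred, feat: min(abs(pred) / 10.0, 1.0)
pred, exit_stage = dynamic_length_inference(x, stages, conf, threshold=0.9)
# "Easy" samples (high confidence early) exit at stage 0; this one runs all 3 stages
```

The speedup claimed in the abstract comes from exactly this pattern: easy samples leave the network after a cheap short-window stage, so the expensive long-window stages run only when the confidence gate demands them.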
Pages: 5942-5952 (11 pages)
Related papers
50 results in total (showing [21]-[30])
  • [21] A Method for Remaining Useful Life Prediction and Uncertainty Quantification of Rolling Bearings Based on Fault Feature Gain
    Yang, Ningning; Zhang, Wei; Zhang, Jingqi; Wang, Ke; Su, Yin; Liu, Yunpeng
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2025, 74
  • [22] PMBCT: The Probabilistic Multiscale Bayesian Convolutional Transformer for Trustworthy Remaining Useful Life Prediction
    Peng, Huachao; Mao, Zehui; Jiang, Bin
    IEEE TRANSACTIONS ON RELIABILITY, 2024
  • [23] Remaining useful life prediction based on an integrated neural network
    Zhang, Y.-F.; Lu, Z.-Q.
    Gongcheng Kexue Xuebao/Chinese Journal of Engineering, 2020, 42(10): 1372-1380
  • [24] A transformer with layer-cross decoding for remaining useful life prediction
    Guo, Peng; Liu, Qi; Yu, Shui; Xiong, Jianyu; Tan, Xiang; Guo, Chao
    JOURNAL OF SUPERCOMPUTING, 2023, 79(10): 11558-11584
  • [25] Remaining Useful Life Prediction of Bearings Based on Convolution Attention Mechanism and Temporal Convolution Network
    Wang, Haitao; Yang, Jie; Wang, Ruihua; Shi, Lichen
    IEEE ACCESS, 2023, 11: 24407-24419
  • [26] A weighted time embedding transformer network for remaining useful life prediction of rolling bearing
    Zhang, Mingyuan; He, Chen; Huang, Chengxuan; Yang, Jianhong
    RELIABILITY ENGINEERING & SYSTEM SAFETY, 2024, 251
  • [27] Dynamic Feature-Aware Graph Convolutional Network With Multisensor for Remaining Useful Life Prediction of Turbofan Engines
    Hua, Juntao; Zhang, Yupeng; Zhang, Dingcheng; He, Jiayuan; Wang, Jie; Fang, Xia
    IEEE SENSORS JOURNAL, 2024, 24(18): 29414-29428
  • [28] Multi-Scale and Multi-Branch Transformer Network for Remaining Useful Life Prediction in Ion Mill Etching Process
    Yuan, Zengwei; Wang, Rui
    IEEE TRANSACTIONS ON SEMICONDUCTOR MANUFACTURING, 2024, 37(01): 67-75
  • [29] Conditional variational transformer for bearing remaining useful life prediction
    Wei, Yupeng; Wu, Dazhong
    ADVANCED ENGINEERING INFORMATICS, 2024, 59
  • [30] Remaining Useful Life Prediction Using a Novel Feature-Attention-Based End-to-End Approach
    Liu, Hui; Liu, Zhenyu; Jia, Weiqiang; Lin, Xianke
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2021, 17(02): 1197-1207