DLformer: A Dynamic Length Transformer-Based Network for Efficient Feature Representation in Remaining Useful Life Prediction

Cited by: 25
Authors
Ren, Lei [1 ,2 ]
Wang, Haiteng [1 ]
Huang, Gao [3 ]
Affiliations
[1] Beihang Univ, Sch Automat Sci & Elect Engn, Beijing 100191, Peoples R China
[2] Zhongguancun Lab, Beijing 100094, Peoples R China
[3] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
Funding
US National Science Foundation;
Keywords
Transformers; Feature extraction; Maintenance engineering; Time series analysis; Computational modeling; Adaptation models; Task analysis; Adaptive inference; deep learning; feature representation; interpretability; remaining useful life (RUL) prediction; PROGNOSTICS;
DOI
10.1109/TNNLS.2023.3257038
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Representation learning-based remaining useful life (RUL) prediction plays a crucial role in improving the security and reducing the maintenance cost of complex systems. Despite their superior performance, the high computational cost of deep networks hinders deploying the models on low-compute platforms. A significant source of this cost is the computation required to represent long sequences. In contrast to most RUL prediction methods, which learn features over a fixed sequence length, we argue that each time series has its own characteristics and that the sequence length should be adjusted adaptively. Our motivation is that an "easy" sample with representative characteristics can be correctly predicted even when only a short feature representation is provided, while "hard" samples need the complete feature representation. We therefore focus on sequence length and propose a dynamic length transformer (DLformer) that can adaptively learn sequence representations of different lengths. A feature reuse mechanism is then developed that exploits previously learned features to reduce redundant computation. Finally, to achieve dynamic feature representation, a dedicated confidence strategy is designed to estimate the confidence level of the prediction results. Regarding interpretability, the dynamic architecture helps humans understand which part of the model is activated. Experiments on multiple datasets show that DLformer can increase inference speed by up to 90% with less than 5% degradation in model accuracy.
Pages: 5942-5952
Number of pages: 11
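
The abstract describes three mechanisms: learning representations over multiple candidate sequence lengths, reusing features computed for shorter sub-sequences, and exiting early once a confidence estimate is high enough. The following is a minimal PyTorch sketch of that general idea only; the class name DLformerSketch, the candidate lengths, the mean pooling, the confidence head, and the exit threshold are all illustrative assumptions and do not reproduce the paper's actual architecture, feature reuse scheme, or confidence strategy.

```python
import torch
import torch.nn as nn


class DLformerSketch(nn.Module):
    """Illustrative early-exit transformer over progressively longer windows."""

    def __init__(self, n_features=14, d_model=64, lengths=(8, 16, 32)):
        super().__init__()
        self.lengths = lengths  # hypothetical candidate window lengths, short to long
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        # one encoder stage per candidate length (TransformerEncoder deep-copies `layer`)
        self.stages = nn.ModuleList(
            nn.TransformerEncoder(layer, num_layers=1) for _ in lengths)
        # per-stage RUL regression head and confidence head
        self.rul_heads = nn.ModuleList(nn.Linear(d_model, 1) for _ in lengths)
        self.conf_heads = nn.ModuleList(nn.Linear(d_model, 1) for _ in lengths)

    def forward(self, x, conf_threshold=0.9):
        # x: (batch=1, time, features); single-sample inference with early exit
        cached = None  # features computed at the previous (shorter) stage
        for i, length in enumerate(self.lengths):
            window = self.embed(x[:, -length:, :])  # most recent `length` steps
            if cached is not None:
                # crude "feature reuse": keep the encoded recent steps from the
                # previous stage and only embed the newly added, older steps
                # (mixing raw embeddings with encoded features is a simplification)
                window = torch.cat(
                    [window[:, : length - cached.size(1), :], cached], dim=1)
            cached = self.stages[i](window)
            pooled = cached.mean(dim=1)  # simple mean pooling over time
            rul = self.rul_heads[i](pooled)
            conf = torch.sigmoid(self.conf_heads[i](pooled))
            # "easy" samples exit after a short window; "hard" ones continue
            if conf.item() >= conf_threshold or i == len(self.lengths) - 1:
                return rul, conf, length


# usage: one sample with 32 time steps and 14 sensor channels (illustrative shapes)
model = DLformerSketch()
with torch.no_grad():
    rul, conf, used_length = model(torch.randn(1, 32, 14))
```

In a full implementation, all exit heads would typically be supervised jointly during training; the early-exit loop above is written for single-sample inference only, where the returned length indicates which stage fired.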