DLformer: A Dynamic Length Transformer-Based Network for Efficient Feature Representation in Remaining Useful Life Prediction

Citations: 25
Authors
Ren, Lei [1 ,2 ]
Wang, Haiteng [1 ]
Huang, Gao [3 ]
Affiliations
[1] Beihang Univ, Sch Automat Sci & Elect Engn, Beijing 100191, Peoples R China
[2] Zhongguancun Lab, Beijing 100094, Peoples R China
[3] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
Funding
National Science Foundation (U.S.);
Keywords
Transformers; Feature extraction; Maintenance engineering; Time series analysis; Computational modeling; Adaptation models; Task analysis; Adaptive inference; deep learning; feature representation; interpretability; remaining useful life (RUL) prediction; PROGNOSTICS;
DOI
10.1109/TNNLS.2023.3257038
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Representation learning-based remaining useful life (RUL) prediction plays a crucial role in improving the security and reducing the maintenance cost of complex systems. Despite their superior performance, the high computational cost of deep networks hinders deploying the models on low-compute platforms. A significant source of this cost is the computation required to represent long sequences. In contrast to most RUL prediction methods, which learn features of a fixed sequence length, we consider that each time series has its own characteristics and that the sequence length should be adjusted adaptively. Our motivation is that an "easy" sample with representative characteristics can be correctly predicted even when only a short feature representation is provided, while "hard" samples need the complete feature representation. Therefore, we focus on sequence length and propose a dynamic length transformer (DLformer) that can adaptively learn sequence representations of different lengths. Then, a feature reuse mechanism is developed to utilize previously learned features and reduce redundant computation. Finally, in order to achieve dynamic feature representation, a particular confidence strategy is designed to calculate the confidence level of the prediction results. Regarding interpretability, the dynamic architecture can help humans understand which part of the model is activated. Experiments on multiple datasets show that DLformer can increase inference speed by up to 90%, with less than 5% degradation in model accuracy.
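The core idea of the abstract, confidence-gated early exit over progressively longer sequence representations, can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: the names (`predict_with_length`, `dynamic_length_predict`), the candidate lengths, the threshold, and the toy variance-based confidence score are all illustrative assumptions standing in for DLformer's sub-networks and learned confidence strategy.

```python
def predict_with_length(series, length):
    """Stand-in for a sub-network that predicts RUL from the last
    `length` points and returns (prediction, confidence)."""
    window = series[-length:]
    mean = sum(window) / len(window)
    # Toy confidence: low-variance ("easy") windows yield high confidence.
    var = sum((x - mean) ** 2 for x in window) / len(window)
    confidence = 1.0 / (1.0 + var)
    return mean, confidence

def dynamic_length_predict(series, lengths=(8, 16, 32), threshold=0.9):
    """Try short representations first; exit early once confident.
    Returns (prediction, sequence_length_actually_used)."""
    for length in lengths:
        pred, conf = predict_with_length(series, length)
        if conf >= threshold:      # "easy" sample: stop here, saving compute
            return pred, length
    return pred, lengths[-1]       # "hard" sample: full-length representation

# A constant series has zero variance, so the shortest length suffices.
easy = [5.0] * 32
pred, used = dynamic_length_predict(easy)
print(pred, used)  # → 5.0 8
```

In DLformer the per-length predictors would share computation through the feature reuse mechanism, so the cost of trying a longer length builds on the shorter one rather than starting over.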
Pages: 5942-5952
Page count: 11
Related Papers
50 records total
  • [1] Domain Adaptive Remaining Useful Life Prediction With Transformer
    Li, Xinyao
    Li, Jingjing
    Zuo, Lin
    Zhu, Lei
    Shen, Heng Tao
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2022, 71
  • [2] A new dual-channel transformer-based network for remaining useful life prediction
    Yang, Kai
    Wei, Yuxuan
    Ma, Yubao
    Huang, Lehong
    Tang, Qiang
    Li, Zhiguo
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2025, 36 (02)
  • [3] A Dual-Scale Transformer-Based Remaining Useful Life Prediction Model in Industrial Internet of Things
    Li, Junhuai
    Wang, Kan
    Hou, Xiangwang
    Lan, Dapeng
    Wu, Yunwen
    Wang, Huaijun
    Liu, Lei
    Mumtaz, Shahid
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (16): 26656 - 26667
  • [4] Transformer Network for Remaining Useful Life Prediction of Lithium-Ion Batteries
    Chen, Daoquan
    Hong, Weicong
    Zhou, Xiuze
    IEEE ACCESS, 2022, 10 : 19621 - 19628
  • [5] Transformer-based hierarchical latent space VAE for interpretable remaining useful life prediction
    Jing, Tao
    Zheng, Pai
    Xia, Liqiao
    Liu, Tianyuan
    ADVANCED ENGINEERING INFORMATICS, 2022, 54
  • [6] Transformer-based novel framework for remaining useful life prediction of lubricant in operational rolling bearings
    Kim, Sunghyun
    Seo, Yun-Ho
    Park, Junhong
    RELIABILITY ENGINEERING & SYSTEM SAFETY, 2024, 251
  • [7] Dual Siamese transformer-encoder-based network for remaining useful life prediction
    Lin, Ching-Sheng
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (17) : 25424 - 25449
  • [8] Multiscale Feature Extension Enhanced Deep Global-Local Attention Network for Remaining Useful Life Prediction
    Li, Rourou
    Jiang, Yimin
    Xia, Tangbin
    Wang, Dong
    Chen, Zhen
    Pan, Ershun
    Xi, Lifeng
    IEEE SENSORS JOURNAL, 2023, 23 (20) : 25557 - 25571
  • [9] A Novel Combination Neural Network Based on ConvLSTM-Transformer for Bearing Remaining Useful Life Prediction
    Deng, Feiyue
    Chen, Zhe
    Liu, Yongqiang
    Yang, Shaopu
    Hao, Rujiang
    Lyu, Litong
    MACHINES, 2022, 10 (12)
  • [10] Effective Latent Representation for Prediction of Remaining Useful Life
    Wang, Qihang
    Wu, Gang
    COMPUTER SYSTEMS SCIENCE AND ENGINEERING, 2021, 36 (01): 225 - 237