A Bidirectional Long Short-Term Memory Autoencoder Transformer for Remaining Useful Life Estimation

Cited by: 12
Authors
Fan, Zhengyang [1]
Li, Wanru [1]
Chang, Kuo-Chu [1]
Affiliations
[1] George Mason University, Department of Systems Engineering and Operations Research, Fairfax, VA 22030, USA
Keywords
Transformer; self-supervised learning; autoencoder; remaining useful life prediction; bidirectional LSTM; turbofan engine
DOI
10.3390/math11244972
Chinese Library Classification
O1 [Mathematics]
Subject Classification Codes
0701; 070101
Abstract
Estimating the remaining useful life (RUL) of aircraft engines plays a pivotal role in enhancing safety, optimizing operations, and promoting sustainability, making it a crucial component of modern aviation management. Precise RUL predictions offer valuable insight into an engine's condition, enabling informed decisions about maintenance and crew scheduling. In this paper, we propose a novel RUL prediction approach that harnesses bidirectional LSTM and Transformer architectures, both known for their success in sequence modeling tasks such as natural language processing. We adopt the encoder of the full Transformer as the backbone of our framework and integrate it with a self-supervised denoising autoencoder that uses a bidirectional LSTM for improved feature extraction. Within our framework, a sequence of multivariate time-series sensor measurements serves as the input; the bidirectional LSTM autoencoder first extracts essential features, which are then fed into the Transformer encoder backbone for RUL prediction. Notably, our approach trains the autoencoder and the Transformer encoder simultaneously, in contrast to naive sequential training. Through numerical experiments on the C-MAPSS datasets, we demonstrate that our proposed models surpass or match the performance of existing methods.
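The pipeline described in the abstract (bidirectional LSTM denoising autoencoder feeding a Transformer encoder, trained jointly) can be sketched as follows. This is a minimal PyTorch illustration, not the authors' implementation: the window length, sensor count, noise level, mean pooling over time, and the weighting of the reconstruction term against the RUL regression term are all assumptions made for the example.

import torch
import torch.nn as nn


class BiLSTMAutoencoderTransformer(nn.Module):
    def __init__(self, n_sensors=14, hidden=64, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        # Denoising autoencoder: bidirectional LSTM encoder + LSTM decoder.
        self.enc_lstm = nn.LSTM(n_sensors, hidden, batch_first=True, bidirectional=True)
        self.dec_lstm = nn.LSTM(2 * hidden, hidden, batch_first=True)
        self.recon_head = nn.Linear(hidden, n_sensors)
        # Transformer encoder backbone operating on the extracted features.
        self.proj = nn.Linear(2 * hidden, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=256,
                                           batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.rul_head = nn.Linear(d_model, 1)

    def forward(self, x, noise_std=0.05):
        # Corrupt the input (denoising objective), then extract BiLSTM features.
        x_noisy = x + noise_std * torch.randn_like(x)
        feats, _ = self.enc_lstm(x_noisy)                 # (B, T, 2*hidden)
        recon = self.recon_head(self.dec_lstm(feats)[0])  # reconstruct clean sensors
        # Feed the features to the Transformer encoder; pool over time for RUL.
        z = self.transformer(self.proj(feats))            # (B, T, d_model)
        rul = self.rul_head(z.mean(dim=1)).squeeze(-1)    # (B,)
        return recon, rul


def joint_loss(recon, x_clean, rul_pred, rul_true, alpha=0.1):
    # Joint (simultaneous) training: RUL regression loss plus a weighted
    # reconstruction loss, instead of training the autoencoder first.
    mse = nn.functional.mse_loss
    return mse(rul_pred, rul_true) + alpha * mse(recon, x_clean)


if __name__ == "__main__":
    model = BiLSTMAutoencoderTransformer()
    x = torch.randn(8, 30, 14)        # batch of 30-cycle windows, 14 sensor channels
    rul_true = torch.rand(8) * 125.0  # capped (piecewise-linear) RUL targets
    recon, rul_pred = model(x)
    loss = joint_loss(recon, x, rul_pred, rul_true)
    loss.backward()
    print(float(loss))

Both loss terms are backpropagated through the shared BiLSTM features, which is one way to realize the simultaneous training of the autoencoder and the Transformer encoder mentioned in the abstract.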
Pages: 17