Supervised Domain Adaptation for Remaining Useful Life Prediction Based on AdaBoost With Long Short-Term Memory

Cited by: 0
Authors
Seo, Seunghwan [1 ,2 ]
Hwang, Jungwoo [3 ]
Chung, Moonkyung [1 ]
Affiliations
[1] Korea Inst Civil Engn & Bldg Technol KICT, Dept Geotech Engn Res, Goyang Si 10223, South Korea
[2] Yonsei Univ, Dept Ind Engn, Seoul 03722, South Korea
[3] PenLab Co Ltd, Seoul 05838, South Korea
Keywords
Long short-term memory; Adaptation models; Data models; Predictive models; Transformers; Logic gates; Feature extraction; Life estimation; AdaBoost-LSTM; domain adaptation; remaining useful life prediction; transfer learning; CONVOLUTIONAL NEURAL-NETWORK; HEALTH PROGNOSTICS
DOI
10.1109/ACCESS.2024.3426909
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
With the advent of high-quality deep learning algorithms, several methods have been proposed to address domain adaptation (DA) problems in remaining useful life prediction. Most of these methods are unsupervised DA techniques, among which adversarial approaches are known to perform best. However, we found that adversarial approaches suffer from instability: their performance depends heavily on the initial weights of the deep-learning network. Furthermore, unsupervised DA methods achieve only limited performance improvements when the domain shift is substantial. To address these challenges, this study proposes a supervised DA method based on AdaBoost with long short-term memory (LSTM) as the base estimator. The proposed approach is particularly effective when the target-domain data are much scarcer than the source-domain data. The methodology was tested on a publicly accessible dataset and, compared with previous unsupervised DA prediction methods, achieved state-of-the-art prediction performance on most domains.
Pages: 96757-96768 (12 pages)
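The abstract describes boosting LSTM base estimators with AdaBoost for supervised domain adaptation. As a minimal sketch of the boosting machinery such a method builds on, the code below implements the classical AdaBoost.R2 regression-boosting loop (Drucker, 1997): weighted fitting, normalized loss, estimator weight β, and weighted-median combination. A dependency-free weighted linear model stands in for the paper's LSTM base estimator; all names are illustrative assumptions, not the authors' code.

```python
import math


class WeightedLinear:
    """Toy base learner: weighted least-squares line y = a*x + b.
    In the paper the base estimator is an LSTM; a linear model
    keeps this sketch dependency-free (assumption, not the
    authors' implementation)."""

    def fit(self, x, y, w):
        sw = sum(w)
        mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
        my = sum(wi * yi for wi, yi in zip(w, y)) / sw
        cov = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
        var = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
        self.a = cov / var if var > 0 else 0.0
        self.b = my - self.a * mx
        return self

    def predict(self, x):
        return [self.a * xi + self.b for xi in x]


def adaboost_r2(x, y, rounds=10):
    """AdaBoost.R2-style boosting for regression."""
    n = len(x)
    w = [1.0 / n] * n                      # uniform sample weights
    models, betas = [], []
    for _ in range(rounds):
        m = WeightedLinear().fit(x, y, w)
        err = [abs(p - yi) for p, yi in zip(m.predict(x), y)]
        emax = max(err) or 1.0             # avoid div-by-zero on perfect fit
        L = [e / emax for e in err]        # normalized per-sample loss
        Lbar = sum(wi * li for wi, li in zip(w, L))
        if Lbar >= 0.5:                    # weak learner no better than chance
            if not models:                 # keep at least one estimator
                models.append(m)
                betas.append(0.5)
            break
        beta = max(Lbar, 1e-12) / (1.0 - Lbar)
        models.append(m)
        betas.append(beta)
        # Up-weight hard samples, down-weight easy ones, renormalize.
        w = [wi * beta ** (1.0 - li) for wi, li in zip(w, L)]
        s = sum(w)
        w = [wi / s for wi in w]
    return models, betas


def boosted_predict(models, betas, x):
    """Weighted median of the boosted estimators, weights log(1/beta)."""
    out = []
    for xi in x:
        preds = sorted((m.predict([xi])[0], math.log(1.0 / b))
                       for m, b in zip(models, betas))
        total = sum(wt for _, wt in preds)
        acc = 0.0
        for p, wt in preds:
            acc += wt
            if acc >= 0.5 * total:
                out.append(p)
                break
    return out
```

In the supervised-DA setting the abstract implies, the fit would run on the pooled source data plus the small labeled target set, so the reweighting step progressively emphasizes the samples, often target-domain ones, that the current ensemble predicts poorly.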