Representation Learning and Knowledge Distillation for Lightweight Domain Adaptation

Cited by: 1
Authors
Bin Shah, Sayed Rafay [1 ]
Putty, Shreyas Subhash [1 ]
Schwung, Andreas [1 ]
Affiliations
[1] South Westphalia Univ Appl Sci, Dept Elect Power Engn, Soest, Germany
Source
2024 IEEE CONFERENCE ON ARTIFICIAL INTELLIGENCE, CAI 2024 | 2024
Keywords
Unsupervised domain adaptation; maximum mean discrepancy; knowledge distillation; representation learning; remaining useful lifetime estimation; C-MAPSS;
DOI
10.1109/CAI59869.2024.00214
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
In industrial machine learning applications, insufficient data, missing labels, distribution shift between subsets, varying operating conditions, etc., lead to poor cross-domain generalization by pre-trained neural network models. In contrast to image detection tasks, time series datasets contain critical domain-specific characteristics that the corresponding networks must learn. Naively aligning the learned representations during adaptation increases the risk of losing this key information, resulting in poor performance. This paper proposes a lightweight domain adaptation method combining representation learning and knowledge distillation (RepLKD). A separate network is pre-trained to capture valuable information from the target data in its latent space with the help of a reconstructor. In the adaptation stage, we use maximum mean discrepancy (MMD) to minimize the difference between the source and target latent distributions. Additionally, we apply knowledge distillation to encourage the target network to generate source-like latent embeddings, penalizing it only when an upper-bound condition is not fulfilled, which prevents over-regularization and the loss of domain-specific features. Finally, we evaluate the proposed method on 12 cross-domain scenarios of the C-MAPSS dataset and compare its efficacy against existing methods from the literature.
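The two adaptation losses the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian-kernel MMD estimator, the bandwidth `sigma`, and the distance-with-margin form of the upper-bound condition are all assumptions made for the example.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of x and y.
    d2 = np.sum(x**2, 1)[:, None] + np.sum(y**2, 1)[None, :] - 2.0 * x @ y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd2(z_src, z_tgt, sigma=1.0):
    # Biased empirical estimate of the squared maximum mean discrepancy
    # between source and target latent embeddings; driving this toward
    # zero aligns the two latent distributions.
    return (gaussian_kernel(z_src, z_src, sigma).mean()
            + gaussian_kernel(z_tgt, z_tgt, sigma).mean()
            - 2.0 * gaussian_kernel(z_src, z_tgt, sigma).mean())

def bounded_distill_loss(z_tgt, z_src_like, margin=0.1):
    # Distillation penalty that fires only when the mean squared distance
    # between target embeddings and source-like embeddings exceeds an
    # upper bound (the hypothetical `margin`), so embeddings already
    # within the bound keep their domain-specific features untouched.
    d = np.mean((z_tgt - z_src_like) ** 2)
    return max(0.0, d - margin)
```

With matched samples both losses vanish, while a shifted target distribution produces positive penalties; in training, a weighted sum of the two terms would be added to the task loss.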
Pages: 1202-1207
Page count: 6