33 references in total
Cross-attentional subdomain adaptation with selective knowledge distillation for motor fault diagnosis under variable working conditions
Cited by: 1
Authors:
Huang, Yixiang [1]; Zhang, Kaiwen [1]; Xia, Pengcheng [1]; Wang, Zhilin [2]; Li, Yanming [1]; Liu, Chengliang [1]
Affiliations:
[1] Shanghai Jiao Tong Univ, State Key Lab Mech Syst & Vibrat, Shanghai, Peoples R China
[2] Shanghai Jiao Tong Univ, Sch Elect Informat & Elect Engn, Shanghai 200240, Peoples R China
Funding:
National Natural Science Foundation of China;
Keywords:
Motor fault diagnosis;
Subdomain adaptation;
Cross-attention mechanism;
Transformer;
Knowledge distillation;
NETWORK;
SPEED;
DOI:
10.1016/j.aei.2024.102948
Chinese Library Classification: TP18 [Artificial Intelligence Theory];
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
Motor fault diagnosis under variable working conditions remains an open challenge in practical applications. Domain adaptation has been explored to reduce the feature distribution discrepancy across working conditions. However, existing methods overlook the relations and domain-related features among individual sample pairs across domains, and the quality of pseudo labels significantly limits subdomain adaptation performance. To address these limitations, a cross-attentional subdomain adaptation (CroAttSA) method with clustering-based selective knowledge distillation is proposed for motor fault diagnosis under variable working conditions. A triple-branch transformer with self-attention and cross-domain attention is designed for domain-specific and domain-correlated feature extraction. Additionally, a correlated local maximum mean discrepancy (CLMMD) loss is introduced for finer-grained, fault-related subdomain adaptation. A clustering-based selective knowledge distillation strategy is further proposed to improve pseudo-label quality and thereby enhance model performance. Extensive experiments on motor fault diagnosis under variable loads and rotating speeds, together with comparison and ablation studies, verify the effectiveness of the model.
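The CLMMD loss described in the abstract is a refinement of the maximum mean discrepancy (MMD) family of distribution-alignment losses. The paper's exact correlated/local weighting is not reproduced here; the sketch below only illustrates the underlying idea — a plain Gaussian-kernel squared MMD between source- and target-domain feature batches, written in NumPy with hypothetical function names (`gaussian_kernel`, `mmd2`):

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of x and y.
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2(source, target, sigma=1.0):
    # Biased estimator of squared maximum mean discrepancy
    # between two feature batches (rows = samples).
    k_ss = gaussian_kernel(source, source, sigma).mean()
    k_tt = gaussian_kernel(target, target, sigma).mean()
    k_st = gaussian_kernel(source, target, sigma).mean()
    return k_ss + k_tt - 2 * k_st

rng = np.random.default_rng(0)
same = mmd2(rng.normal(0, 1, (64, 8)), rng.normal(0, 1, (64, 8)))
shifted = mmd2(rng.normal(0, 1, (64, 8)), rng.normal(2, 1, (64, 8)))
print(same < shifted)  # → True: a distribution shift yields a larger discrepancy
```

A subdomain (local) variant, as in the paper, would compute such a discrepancy per fault class using (pseudo-)labels and average the class-wise terms, which is why pseudo-label quality directly limits adaptation performance.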
Pages: 13
Related papers