Knowledge Distillation and Multi-task Feature Learning for Partial Discharge Recognition

Cited: 0
Authors
Ji, Jinsheng [1 ]
Shu, Zhou [1 ]
Li, Hongqun [2 ]
Lai, Kai Xian [3 ]
Zheng, Yuanjin [1 ]
Jiang, Xudong [1 ]
Affiliations
[1] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
[2] SP Grp, Grid Digitalisat, Singapore 349277, Singapore
[3] SP Grp, Asset Sensing & Analyt, Singapore 349277, Singapore
Source
2023 IEEE 32ND CONFERENCE ON ELECTRICAL PERFORMANCE OF ELECTRONIC PACKAGING AND SYSTEMS, EPEPS, 2023
Funding
National Research Foundation, Singapore
Keywords
Knowledge distillation; Partial discharge; Pattern recognition; Convolutional neural network
DOI
10.1109/EPEPS58208.2023.10314925
CLC number
TP301 [Theory, Methods]
Discipline code
081202
Abstract
Developing an intelligent diagnosis system for accurate detection and recognition of partial discharge (PD) in switchgear has garnered significant attention in recent years. Owing to inevitable noise interference and the high similarity between different PD signals, detecting and identifying PDs with a portable PD detector remains challenging. In this study, we aim to transfer the knowledge acquired by a large-scale network to a lightweight network for precise PD recognition. To this end, we first employ a k-means clustering model to separate signals originating from different sources, thereby obtaining Phase Resolved Partial Discharge (PRPD) patterns. We then introduce a knowledge distillation and multi-task feature learning framework to extract discriminative features from the PRPD patterns. Experiments comparing the proposed method against state-of-the-art methods on our constructed PD recognition dataset demonstrate its superiority.
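The abstract gives no implementation details, so as a rough illustration only, the standard knowledge-distillation objective (Hinton et al., 2015) that such teacher-to-student frameworks typically build on can be sketched in plain NumPy. All function names, the temperature `T`, the weight `alpha`, and the three example PD classes below are assumptions for the sketch, not taken from the paper:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T yields softer probabilities.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of (a) the KL divergence between the teacher's and the
    student's temperature-softened predictions and (b) ordinary cross-entropy
    against the hard PD-class labels."""
    p_t = softmax(teacher_logits, T)   # soft targets from the large teacher
    p_s = softmax(student_logits, T)   # soft predictions of the small student
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12)
    # T**2 rescales the soft-target term, as suggested by Hinton et al.
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))

# Toy example: 2 samples, 3 hypothetical PD classes (e.g. corona, surface, void).
teacher = np.array([[4.0, 1.0, 0.0], [0.5, 3.5, 0.2]])
student = np.array([[3.0, 1.5, 0.2], [0.4, 2.8, 0.6]])
loss = distillation_loss(student, teacher, labels=np.array([0, 1]))
```

When the student's logits exactly match the teacher's, the KL term vanishes and only the hard-label cross-entropy remains, which is why the `alpha` weight trades off mimicking the teacher against fitting the labels.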
Pages: 3