Knowledge Distillation and Multi-task Feature Learning for Partial Discharge Recognition

Cited: 0
|
Authors
Ji, Jinsheng [1 ]
Shu, Zhou [1 ]
Li, Hongqun [2 ]
Lai, Kai Xian [3 ]
Zheng, Yuanjin [1 ]
Jiang, Xudong [1 ]
Affiliations
[1] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
[2] SP Grp, Grid Digitalisat, Singapore 349277, Singapore
[3] SP Grp, Asset Sensing & Analyt, Singapore 349277, Singapore
Source
2023 IEEE 32ND CONFERENCE ON ELECTRICAL PERFORMANCE OF ELECTRONIC PACKAGING AND SYSTEMS, EPEPS | 2023
Funding
National Research Foundation, Singapore;
Keywords
Knowledge distillation; Partial discharge; Pattern recognition; Convolutional neural network;
DOI
10.1109/EPEPS58208.2023.10314925
CLC Number
TP301 [Theory, Methods];
Discipline Code
081202 ;
Abstract
To achieve accurate detection and recognition of partial discharge (PD) in switchgear, the development of intelligent PD diagnosis systems has garnered significant attention in recent years. Owing to inevitable noise interference and the high similarity of different PD signals, detecting and identifying PDs with a portable PD detector remains challenging. In this study, we transfer the knowledge acquired by a large-scale network to a lightweight network for precise PD recognition. To this end, we employ a k-means clustering model to separate signals originating from different sources, thereby obtaining Phase Resolved Partial Discharge (PRPD) patterns. We then introduce knowledge distillation and a multi-task feature learning framework to extract discriminative features from the PRPD patterns. We evaluate the proposed method against state-of-the-art methods on our constructed PD recognition dataset to demonstrate its effectiveness.
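The abstract does not specify the exact distillation objective, so the following is only a minimal sketch of the standard soft-target distillation loss (in the style of Hinton et al.) that such a teacher-to-student transfer typically uses: a temperature-softened KL-divergence term against the teacher's logits blended with the usual cross-entropy on hard labels. The temperature `T`, weight `alpha`, and the toy logits below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend of soft-target KL divergence and hard-label cross-entropy.

    alpha weights the distillation term; T softens both distributions.
    The T**2 factor keeps soft-target gradients on the same scale as
    the hard-label term (illustrative hyperparameters, not the paper's).
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student) per sample, summed over classes
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    # Standard cross-entropy of the student against the ground-truth labels
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12)
    return np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce)

# Toy example: 3 hypothetical PD classes, batch of 2 PRPD patterns
teacher = np.array([[5.0, 1.0, 0.5], [0.2, 4.0, 1.0]])  # large-network logits
student = np.array([[2.0, 0.5, 0.1], [0.1, 1.5, 0.7]])  # lightweight-network logits
labels = np.array([0, 1])
loss = distillation_loss(student, teacher, labels)
print(f"distillation loss = {loss:.4f}")
```

When the student's logits exactly match the teacher's, the KL term vanishes and the loss reduces to the weighted cross-entropy, which is what drives the lightweight network toward the large network's behavior.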
Pages: 3