Power transformer fault diagnosis based on a self-strengthening offline pre-training model

Times cited: 10
Authors
Zhong, Mingwei [1 ]
Yi, Siqi [1 ]
Fan, Jingmin [1 ]
Zhang, Yikang [1 ]
He, Guanglin [1 ]
Cao, Yunfei [1 ]
Feng, Lutao [1 ]
Tan, Zhichao [1 ]
Mo, Wenjun [1 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou 510006, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Power transformer fault diagnosis; Residual variational auto encoder; Ensemble learning; Self-strengthening strategy; Offline pre-training model; DISSOLVED-GAS ANALYSIS; NEURAL-NETWORK; FUZZY-LOGIC; CLASSIFIER; OPTIMIZER; SYSTEM;
DOI
10.1016/j.engappai.2023.107142
CLC classification
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
Accurate transformer fault diagnosis is crucial for maintaining power system stability. Owing to the complex operating conditions of the transformer, its faults are characterized by multiple fault classes, class imbalance, and limited availability of diagnostic data. In addition, some fault samples carry only coarse overheating or discharge labels when collected, and how to make use of such samples is a challenge. To address these issues, this paper proposes a novel transformer fault diagnosis method based on a hybrid model that combines a Res-Variational-Auto-Encoder (ResVAE) with an ensemble learning (EL) model. Through a self-strengthening strategy, fault characteristics are extracted category by category using a residual convolutional neural network, and the low-dimensional characteristics are mapped into characteristic fusion samples by the VAE. On the basis of this strategy, an offline pre-training model is built from the ResVAE and the EL model. The hybrid model obtains more information from the offline source domain, enabling the EL model to diagnose multiple fault types as well as undetermined faults. On 11-category, class-imbalanced classification scenarios with limited sample sizes, eight data expansion algorithms and six diagnosis algorithms are compared. The results show that the offline pre-training EL model improves diagnostic accuracy by up to 11.224% compared with traditional ratio methods. The ResVAE-EL model achieves the highest diagnostic accuracy of 91.011%, which is 10.112% higher than that of the single offline pre-training model.
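The abstract describes a residual variational auto-encoder used to map fault features into a low-dimensional space and to generate characteristic fusion samples. The short PyTorch sketch below illustrates one way such a ResVAE could look; the input dimension, layer sizes, fully connected residual blocks, and loss weighting are illustrative assumptions rather than the authors' exact architecture, and the self-strengthening strategy and ensemble-learning classifier described in the paper are not shown.

# Minimal ResVAE sketch (assumed architecture, not the paper's exact design).
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two fully connected layers with a skip connection."""
    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        return torch.relu(x + self.net(x))

class ResVAE(nn.Module):
    """Residual encoder mapping gas-feature vectors to a low-dimensional latent
    space; the decoder reconstructs (or synthesizes) fusion samples from it."""
    def __init__(self, in_dim: int = 5, hidden: int = 32, latent: int = 4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                     ResidualBlock(hidden), ResidualBlock(hidden))
        self.to_mu = nn.Linear(hidden, latent)
        self.to_logvar = nn.Linear(hidden, latent)
        self.decoder = nn.Sequential(nn.Linear(latent, hidden), nn.ReLU(),
                                     ResidualBlock(hidden), nn.Linear(hidden, in_dim))

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        return self.decoder(z), mu, logvar

# Usage: encode a batch of dissolved-gas feature vectors and form the VAE loss.
model = ResVAE()
x = torch.rand(8, 5)                      # 8 samples, 5 gas features (assumed)
recon, mu, logvar = model(x)
kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())  # KL term of the VAE loss
loss = nn.functional.mse_loss(recon, x) + kl

In a pre-training setting of this kind, the decoder output would supply synthetic fusion samples for the minority fault classes, and the resulting balanced set would then be passed to the ensemble-learning classifier.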
Pages: 15
Related papers
50 records in total
  • [1] On the Effect of Pre-training for Transformer in Different Modality on Offline Reinforcement Learning
    Takagi, Shiro
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [2] Long Document Extractive Summarization Method Based on Pre-training Model and Transformer
    Zhou, Xinxin
    Guo, Yuechen
    Huang, Yuning
    Yan, Yuming
    Li, Maoyuan
    Journal of Network Intelligence, 2023, 8 (03): : 913 - 931
  • [3] DiT: Self-supervised Pre-training for Document Image Transformer
    Li, Junlong
    Xu, Yiheng
    Lv, Tengchao
    Cui, Lei
    Zhang, Cha
    Wei, Furu
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022, : 3530 - 3539
  • [4] Pre-training a Transformer-Based Generative Model Using a Small Sepedi Dataset
    Ramalepe, Simon Phetole
    Modipa, Thipe I.
    Davel, Marelie H.
    ARTIFICIAL INTELLIGENCE RESEARCH, SACAIR 2024, 2025, 2326 : 319 - 333
  • [5] Fault diagnosis model for power transformer based on Bayesian network
    Wang, YQ
    Lu, FC
    Li, HM
    ICEMI 2005: CONFERENCE PROCEEDINGS OF THE SEVENTH INTERNATIONAL CONFERENCE ON ELECTRONIC MEASUREMENT & INSTRUMENTS, VOL 8, 2005, : 141 - 146
  • [6] Fault diagnosis of power transformer based on grey cloud model
    Cai, Hong-Mei
    Chen, Jian-Yong
    Su, Hao-Yi
    Dianli Xitong Baohu yu Kongzhi/Power System Protection and Control, 2012, 40 (12): : 151 - 155
  • [7] Fault diagnosis model of power transformer based on combinatorial KFDA
    Liang, Yongchun
    Sun, Xiaoyun
    Liu, Qingrui
    Bian, Hanpeng
    Li, Yanming
    PROCEEDINGS OF 2008 INTERNATIONAL CONFERENCE ON CONDITION MONITORING AND DIAGNOSIS, 2007, : 956 - +
  • [8] TRANSFORMER BASED UNSUPERVISED PRE-TRAINING FOR ACOUSTIC REPRESENTATION LEARNING
    Zhang, Ruixiong
    Wu, Haiwei
    Li, Wubo
    Jiang, Dongwei
    Zou, Wei
    Li, Xiangang
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 6933 - 6937
  • [9] Fault diagnosis model for power transformer based on statistical theory
    Zhao, Wen-Qing
    Zhu, Yong-Li
    Wang, De-Wen
    Zhai, Xue-Ming
    2007 INTERNATIONAL CONFERENCE ON WAVELET ANALYSIS AND PATTERN RECOGNITION, VOLS 1-4, PROCEEDINGS, 2007, : 962 - 966
  • [10] Survey: Transformer based video-language pre-training
    Ruan, Ludan
    Jin, Qin
    AI OPEN, 2022, 3 : 1 - 13