Domain Knowledge Distillation and Supervised Contrastive Learning for Industrial Process Monitoring

Cited by: 11
Authors
Ai, Mingxi [1 ,2 ]
Xie, Yongfang [1 ]
Ding, Steven X. [2]
Tang, Zhaohui [1 ]
Gui, Weihua [1 ]
Affiliations
[1] Cent South Univ, Sch Automat, Changsha 410083, Peoples R China
[2] Univ Duisburg Essen, Inst Automat Control & Complex Syst, D-47057 Duisburg, Germany
Keywords
Feature extraction; Process monitoring; Deep learning; Knowledge engineering; Convolutional neural networks; Task analysis; Reliability; Hard negative; industrial process monitoring; knowledge distillation; memory queue-based negative sample augmentation; supervised contrastive learning; HANDCRAFTED FEATURES; IMAGE; FLOTATION;
DOI
10.1109/TIE.2022.3206696
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
To ensure the reliability and safety of modern industrial processes, computer vision-based soft measurement for process monitoring has received considerable attention owing to its nonintrusive nature. State-of-the-art computer vision-based approaches mostly rely on feature embeddings from deep neural networks. However, such feature extraction suffers from noise effects and the scarcity of labeled training instances, leading to unsatisfactory performance in real industrial process monitoring. In this article, we develop a novel hybrid learning framework for feature representation based on knowledge distillation and supervised contrastive learning. First, we transfer the abundant semantic information contained in handcrafted features to a deep feature extraction network via knowledge distillation. Then, to enhance feature discrimination, supervised contrastive learning is employed to contrast many positive pairs against many negative pairs per anchor. In addition, two important mechanisms, memory queue-based negative sample augmentation and hard negative sampling, are incorporated into the supervised contrastive learning model to support the proper selection of negative samples. Finally, a flotation process monitoring problem is used to demonstrate the effectiveness of the proposed method.
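To make the contrastive component more concrete, the following is a minimal PyTorch sketch of a supervised contrastive loss that draws negatives from a memory queue and keeps only the hardest negatives per anchor, in the spirit of the mechanisms named in the abstract. It is not the authors' implementation; the class name, hyperparameters (feat_dim, queue_size, tau, num_hard), and the plain dot-product similarity are assumptions made for illustration.

```python
import torch
import torch.nn.functional as F

class QueueSupConLoss(torch.nn.Module):
    """Sketch: supervised contrastive loss with a memory queue and hard-negative selection."""

    def __init__(self, feat_dim=128, queue_size=4096, tau=0.07, num_hard=256):
        super().__init__()
        self.tau = tau
        self.num_hard = num_hard
        # Memory queue of L2-normalized embeddings and their class labels (-1 = empty slot).
        self.register_buffer("queue", F.normalize(torch.randn(queue_size, feat_dim), dim=1))
        self.register_buffer("queue_labels", torch.full((queue_size,), -1, dtype=torch.long))
        self.ptr = 0

    @torch.no_grad()
    def enqueue(self, feats, labels):
        # FIFO update: overwrite the oldest slots with the current batch.
        n = feats.size(0)
        idx = (self.ptr + torch.arange(n, device=feats.device)) % self.queue.size(0)
        self.queue[idx] = F.normalize(feats, dim=1)
        self.queue_labels[idx] = labels
        self.ptr = int((self.ptr + n) % self.queue.size(0))

    def forward(self, feats, labels):
        feats = F.normalize(feats, dim=1)                    # anchor embeddings, shape (B, D)
        sim = feats @ self.queue.t() / self.tau              # similarity to every queued sample
        pos_mask = labels.unsqueeze(1) == self.queue_labels.unsqueeze(0)
        neg_mask = ~pos_mask & (self.queue_labels != -1).unsqueeze(0)

        # Hard-negative sampling: per anchor, keep only the most similar negatives.
        neg_sim = sim.masked_fill(~neg_mask, float("-inf"))
        k = min(self.num_hard, neg_sim.size(1))
        hard_neg, _ = neg_sim.topk(k, dim=1)                 # shape (B, k)

        loss, count = 0.0, 0
        for i in range(feats.size(0)):
            pos = sim[i][pos_mask[i]]                        # positives for this anchor
            if pos.numel() == 0:
                continue
            # Each positive is contrasted against itself plus the anchor's hard negatives.
            logits = torch.cat([pos.unsqueeze(1),
                                hard_neg[i].expand(pos.numel(), -1)], dim=1)
            loss = loss - (pos - torch.logsumexp(logits, dim=1)).mean()
            count += 1
        self.enqueue(feats.detach(), labels)
        return loss / max(count, 1)
```

In the full framework described above, a knowledge distillation term (for example, a penalty pulling the learned embedding toward a representation derived from handcrafted features) would be added to this contrastive objective; it is omitted here for brevity.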
Pages: 9452 - 9462
Number of pages: 11
Related Papers (50 records)
  • [11] A Prior-Knowledge-Guided Neural Network Based on Supervised Contrastive Learning for Radar HRRP Recognition
    Liu, Qi
    Zhang, Xinyu
    Liu, Yongxiang
    IEEE TRANSACTIONS ON AEROSPACE AND ELECTRONIC SYSTEMS, 2024, 60 (03) : 2854 - 2873
  • [12] Supervised Contrastive Learning With Structure Inference for Graph Classification
    Ji, Junzhong
    Jia, Hao
    Ren, Yating
    Lei, Minglong
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2023, 10 (03) : 1684 - 1695
  • [13] Semisupervised Contrastive Memory Network for Industrial Process Working Condition Monitoring
    Tang, Zhaohui
    Zhang, Jin
    Xie, Yongfang
    Ding, Steven X.
    Ai, Mingxi
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2023, 72
  • [14] Balanced Knowledge Distillation with Contrastive Learning for Document Re-ranking
    Yang, Yingrui
    He, Shanxiu
    Qiao, Yifan
    Xie, Wentai
    Yang, Tao
    PROCEEDINGS OF THE 2023 ACM SIGIR INTERNATIONAL CONFERENCE ON THE THEORY OF INFORMATION RETRIEVAL, ICTIR 2023, 2023 : 247 - 255
  • [15] Progressively Balanced Supervised Contrastive Representation Learning for Long-Tailed Fault Diagnosis
    Peng, Peng
    Lu, Jiaxun
    Tao, Shuting
    Ma, Ke
    Zhang, Yi
    Wang, Hongwei
    Zhang, Heming
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2022, 71
  • [16] Semi-supervised contrastive learning for flotation process monitoring with uncertainty-aware prototype optimization
    Ai, Mingxi
    Zhang, Jin
    Li, Peng
    Wu, Jiande
    Tang, Zhaohui
    Xie, Yongfang
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2025, 145
  • [17] Semi-Supervised Image Deraining Using Knowledge Distillation
    Cui, Xin
    Wang, Cong
    Ren, Dongwei
    Chen, Yunjin
    Zhu, Pengfei
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2022, 32 (12) : 8327 - 8341
  • [18] Spectral-Spatial Masked Transformer With Supervised and Contrastive Learning for Hyperspectral Image Classification
    Huang, Lingbo
    Chen, Yushi
    He, Xin
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61
  • [19] Knowledge Distillation-Based Domain-Invariant Representation Learning for Domain Generalization
    Niu, Ziwei
    Yuan, Junkun
    Ma, Xu
    Xu, Yingying
    Liu, Jing
    Chen, Yen-Wei
    Tong, Ruofeng
    Lin, Lanfen
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 245 - 255
  • [20] Self-supervised knowledge distillation for complementary label learning
    Liu, Jiabin
    Li, Biao
    Lei, Minglong
    Shi, Yong
    NEURAL NETWORKS, 2022, 155 : 318 - 327