Domain Knowledge Distillation and Supervised Contrastive Learning for Industrial Process Monitoring

Cited by: 11
Authors
Ai, Mingxi [1 ,2 ]
Xie, Yongfang [1 ]
Ding, Steven X. [2]
Tang, Zhaohui [1 ]
Gui, Weihua [1 ]
Affiliations
[1] Central South University, School of Automation, Changsha 410083, People's Republic of China
[2] University of Duisburg-Essen, Institute for Automatic Control and Complex Systems, D-47057 Duisburg, Germany
Keywords
Feature extraction; Process monitoring; Deep learning; Knowledge engineering; Convolutional neural networks; Task analysis; Reliability; Hard negative; Industrial process monitoring; Knowledge distillation; Memory queue-based negative sample augmentation; Supervised contrastive learning; Handcrafted features; Image; Flotation
DOI
10.1109/TIE.2022.3206696
CLC number
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
To ensure the reliability and safety of modern industrial processes, computer vision-based soft measurement has received considerable attention for process monitoring due to its nonintrusive nature. State-of-the-art computer vision-based approaches mostly rely on feature embeddings from deep neural networks. However, such features suffer from noise effects and from the scarcity of labeled training instances, leading to unsatisfactory performance in real industrial process monitoring. In this article, we develop a novel hybrid learning framework for feature representation based on knowledge distillation and supervised contrastive learning. First, we transfer the rich semantic information in handcrafted features to a deep feature extraction network via knowledge distillation. Then, to enhance feature discrimination, supervised contrastive learning is employed to contrast multiple positive pairs against multiple negative pairs per anchor. In addition, two important mechanisms, memory queue-based negative sample augmentation and hard negative sampling, are incorporated into the supervised contrastive learning model to guide the proper selection of negative samples. Finally, a flotation process monitoring problem is used to demonstrate the effectiveness of the proposed method.
Pages: 9452-9462
Page count: 11
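
The memory queue-based negative augmentation and hard negative sampling described in the abstract can be sketched in a few lines of PyTorch. The following is a minimal illustration, assuming a SupCon-style supervised contrastive loss (Khosla et al., 2020) combined with a MoCo-style feature queue; the function name supcon_loss_with_queue, the temperature value, and the top-k hard-negative selection are illustrative assumptions, not the authors' exact formulation.

import torch
import torch.nn.functional as F

def supcon_loss_with_queue(anchors, labels, queue_feats, queue_labels,
                           temperature=0.1, num_hard_negatives=256):
    # Hypothetical sketch: each anchor is contrasted against a memory queue
    # of past embeddings. Queue entries sharing the anchor's label serve as
    # positives; the remaining entries are candidate negatives, of which only
    # the hardest (most similar) k are kept.
    anchors = F.normalize(anchors, dim=1)          # (B, D) anchor embeddings
    queue_feats = F.normalize(queue_feats, dim=1)  # (Q, D) queued embeddings

    sim = anchors @ queue_feats.T / temperature    # (B, Q) scaled similarities
    pos_mask = labels.unsqueeze(1).eq(queue_labels.unsqueeze(0))  # (B, Q)

    # Hard negative sampling: mask out positives, keep the top-k negatives.
    neg_sim = sim.masked_fill(pos_mask, float('-inf'))
    k = min(num_hard_negatives, queue_feats.size(0))
    hard_neg_sim, _ = neg_sim.topk(k, dim=1)       # (B, k)

    losses = []
    for i in range(anchors.size(0)):
        pos = sim[i][pos_mask[i]]                  # positives for anchor i
        if pos.numel() == 0:                       # no same-class entry queued
            continue
        # SupCon: -mean over positives of log(exp(pos) / sum over all pairs).
        denom = torch.logsumexp(torch.cat([pos, hard_neg_sim[i]]), dim=0)
        losses.append((denom - pos).mean())
    if not losses:                                 # queue held no positives
        return anchors.new_zeros(())
    return torch.stack(losses).mean()

In a MoCo-style setup, queue_feats and queue_labels would be updated first-in-first-out with momentum-encoder outputs after every batch, decoupling the number of negatives per anchor from the batch size; the knowledge distillation branch would add a separate matching term (e.g., a regression or KL loss) aligning an intermediate embedding of the network with the handcrafted feature vector.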