ADCL: Adversarial Distilled Contrastive Learning on lightweight models for self-supervised image classification

Cited by: 1
Authors
Wu, Ran [1 ]
Liu, Huanyu [1 ]
Li, Jun-Bao [1 ]
Affiliations
[1] Harbin Inst Technol, Sch Comp Sci, Yikuang St, Harbin 150001, Heilongjiang, Peoples R China
Keywords
Adversarial distillation; Lightweight models; Self-supervised learning; KNOWLEDGE;
DOI
10.1016/j.knosys.2023.110824
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the spread of modern sensors, large numbers of images are collected in edge application scenarios; however, exploiting them is expensive because labeling them for downstream use requires massive effort. Self-supervised learning, which needs no labeled data, shows great potential in this context, yet it suffers notable performance degradation when training the lightweight networks that are essential for edge deployment. We propose an effective distillation method, Adversarial Distilled Contrastive Learning (ADCL), to mitigate this issue. Specifically, we introduce knowledge distillation into self-supervised learning to transfer the underlying feature-clustering relations from teacher models to shallow student models. We adopt an online-updated rather than a pretrained teacher model, which makes adaptation to specific data domains convenient. An adversarial loss term is introduced to alleviate the unstable optimization caused by the online-trained teacher, by forcing the teacher to discover feature relations beyond the recognition of the student. In contrast to other self-supervised knowledge distillation methods that maintain data queues of positive and negative examples, an asymmetric contrastive learning scheme is employed to further relieve the memory bottleneck during training. Experimental results confirm the effectiveness of our method: when ResNet-50 is used as the teacher for ResNet-18 on ImageNet, ADCL achieves a top-1 accuracy of 60.3%, surpassing other knowledge distillation methods with online teachers and matching approaches that use pretrained teachers and data queues. © 2023 Elsevier B.V. All rights reserved.
Pages: 11
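The abstract outlines a training scheme with an online teacher, an adversarial term that pushes the teacher toward feature relations the student has not yet captured, and queue-free asymmetric contrastive distillation. The following PyTorch sketch illustrates that general idea; the function names (info_nce, training_step), the loss weighting (adv_weight), the projection dimension, and the exact adversarial formulation are illustrative assumptions, not the objective published in the paper.

```python
# Illustrative sketch of an adversarially distilled contrastive training loop.
# All design choices below are assumptions made for illustration only.
import torch
import torch.nn.functional as F
import torchvision

def info_nce(q, k, temperature=0.2):
    """Batch-wise InfoNCE: each query q[i] should match key k[i] against batch negatives."""
    q = F.normalize(q, dim=1)
    k = F.normalize(k, dim=1)
    logits = q @ k.t() / temperature                      # (B, B) similarity matrix
    labels = torch.arange(q.size(0), device=q.device)     # positives on the diagonal
    return F.cross_entropy(logits, labels)

# Teacher (e.g. ResNet-50) is trained online together with the student (ResNet-18);
# the final fc layer acts as a simple 128-d projection head (an assumption).
teacher = torchvision.models.resnet50(num_classes=128)
student = torchvision.models.resnet18(num_classes=128)
opt_t = torch.optim.SGD(teacher.parameters(), lr=0.05, momentum=0.9)
opt_s = torch.optim.SGD(student.parameters(), lr=0.05, momentum=0.9)

def training_step(view1, view2, adv_weight=0.5):
    # Teacher step: its own contrastive loss, plus an adversarial term that rewards
    # disagreement with the (frozen) student, so the teacher keeps discovering
    # relations the student does not yet recognize.
    zt1, zt2 = teacher(view1), teacher(view2)
    with torch.no_grad():
        zs1 = student(view1)
    loss_t = info_nce(zt1, zt2) - adv_weight * info_nce(zt1, zs1)
    opt_t.zero_grad(); loss_t.backward(); opt_t.step()

    # Student step: asymmetric contrastive distillation against frozen teacher keys,
    # using only in-batch negatives (no memory queue).
    with torch.no_grad():
        kt = teacher(view2)
    zs = student(view1)
    loss_s = info_nce(zs, kt)
    opt_s.zero_grad(); loss_s.backward(); opt_s.step()
    return loss_t.item(), loss_s.item()
```

In practice, view1 and view2 would be two random augmentations of the same image batch; the paper's actual loss definitions, schedules, and hyperparameters should be taken from the publication itself.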
Related papers
50 results in total (first 10 listed)
  • [1] DisCo: Remedying Self-supervised Learning on Lightweight Models with Distilled Contrastive Learning
    Gao, Yuting
    Zhuang, Jia-Xin
    Lin, Shaohui
    Cheng, Hao
    Sun, Xing
    Li, Ke
    Shen, Chunhua
    COMPUTER VISION, ECCV 2022, PT XXVI, 2022, 13686 : 237 - 253
  • [2] Adversarial Self-Supervised Contrastive Learning
    Kim, Minseon
    Tack, Jihoon
    Hwang, Sung Ju
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS (NEURIPS 2020), 2020, 33
  • [3] Image classification framework based on contrastive self-supervised learning
    Zhao H.-W.
    Zhang J.-R.
    Zhu J.-P.
    Li H.
Jilin Daxue Xuebao (Gongxueban)/Journal of Jilin University (Engineering and Technology Edition), 2022, 52 (08): 1850 - 1856
  • [4] Decoupled Adversarial Contrastive Learning for Self-supervised Adversarial Robustness
    Zhang, Chaoning
    Zhang, Kang
    Zhang, Chenshuang
    Niu, Axi
    Feng, Jiu
    Yoo, Chang D.
    Kweon, In So
    COMPUTER VISION - ECCV 2022, PT XXX, 2022, 13690 : 725 - 742
  • [5] Contrastive Self-supervised Learning for Graph Classification
    Zeng, Jiaqi
    Xie, Pengtao
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 10824 - 10832
  • [6] Pathological Image Contrastive Self-supervised Learning
    Qin, Wenkang
    Jiang, Shan
    Luo, Lin
    RESOURCE-EFFICIENT MEDICAL IMAGE ANALYSIS, REMIA 2022, 2022, 13543 : 85 - 94
  • [7] Contrastive Self-Supervised Learning Leads to Higher Adversarial Susceptibility
    Gupta, Rohit
    Akhtar, Naveed
    Mian, Ajmal
    Shah, Mubarak
THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 12, 2023: 14838 - 14846
  • [8] Self-Supervised Learning With Learnable Sparse Contrastive Sampling for Hyperspectral Image Classification
    Liang, Miaomiao
    Dong, Jian
    Yu, Lingjuan
    Yu, Xiangchun
    Meng, Zhe
    Jiao, Licheng
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61 : 1 - 13
  • [9] SCL: Self-supervised contrastive learning for few-shot image classification
    Lim, Jit Yan
    Lim, Kian Ming
    Lee, Chin Poo
    Tan, Yong Xuan
    NEURAL NETWORKS, 2023, 165 : 19 - 30
  • [10] Contrastive self-supervised learning for neurodegenerative disorder classification
    Gryshchuk, Vadym
    Singh, Devesh
    Teipel, Stefan
    Dyrba, Martin
    ADNI Study Group
    AIBL Study Group
    FTLDNI Study Group
    FRONTIERS IN NEUROINFORMATICS, 2025, 19