Image classification based on self-distillation

Times Cited: 2
Authors
Li, Yuting [1]
Qing, Linbo [1]
He, Xiaohai [1]
Chen, Honggang [1]
Liu, Qiang [1]
Affiliation
[1] Sichuan Univ, Coll Elect & Informat Engn, 24 South Sect 1, Yihuan Rd, Chengdu 610065, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image classification; Self-distillation; Attention; Fusion;
DOI
10.1007/s10489-022-04008-y
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Convolutional neural networks have been widely used in various application scenarios. To extend their use to areas where accuracy is critical, researchers have investigated deeper or wider network structures, which leads to exponential growth in computation and storage costs and longer response times. In this paper, we propose a self-distillation image classification algorithm that significantly improves performance while reducing training costs. In traditional self-distillation, the student model lacks guidance from a teacher model and therefore struggles to capture global information and focus on key features. For this reason, we improve the traditional self-distillation algorithm with a positional attention module and a residual block with attention. Experimental results show that the method achieves better performance than traditional knowledge distillation methods and attention networks.
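A minimal sketch of the kind of setup the abstract describes, assuming a typical PyTorch self-distillation formulation: a shallow classifier branch learns from both the ground-truth labels and the softened outputs of the deepest branch, and a positional attention module re-weights spatial features. The names (PositionalAttention, self_distillation_loss), the branch layout, and the loss weighting are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PositionalAttention(nn.Module):
    """Illustrative self-attention over spatial positions (channels assumed >= 8)."""

    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # B x HW x C'
        k = self.key(x).flatten(2)                      # B x C' x HW
        attn = F.softmax(q @ k, dim=-1)                 # B x HW x HW attention map
        v = self.value(x).flatten(2)                    # B x C x HW
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                     # residual connection


def self_distillation_loss(shallow_logits, deep_logits, labels, T=3.0, alpha=0.5):
    """Shallow branch learns from the labels and from the deepest branch's
    softened predictions (a standard self-distillation objective)."""
    ce = F.cross_entropy(shallow_logits, labels)
    kd = F.kl_div(
        F.log_softmax(shallow_logits / T, dim=1),
        F.softmax(deep_logits.detach() / T, dim=1),  # deepest branch acts as teacher
        reduction="batchmean",
    ) * (T * T)
    return (1 - alpha) * ce + alpha * kd


if __name__ == "__main__":
    # Toy check: the attention block preserves the feature-map shape,
    # and the loss combines hard-label and soft-label terms.
    feats = torch.randn(2, 64, 8, 8)
    attn_feats = PositionalAttention(64)(feats)
    shallow, deep = torch.randn(2, 10), torch.randn(2, 10)
    loss = self_distillation_loss(shallow, deep, torch.randint(0, 10, (2,)))
    print(attn_feats.shape, loss.item())
```

In this sketch the deepest branch is treated as the in-network teacher; how the paper actually combines the attention-augmented residual blocks with the distillation branches is not specified in the abstract.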
Pages: 9396-9408
Number of pages: 13
Related Papers
50 records in total
  • [1] Image classification based on self-distillation. Li, Yuting; Qing, Linbo; He, Xiaohai; Chen, Honggang; Liu, Qiang. Applied Intelligence, 2023, 53: 9396-9408
  • [2] Tolerant Self-Distillation for image classification. Liu, Mushui; Yu, Yunlong; Ji, Zhong; Han, Jungong; Zhang, Zhongfei. Neural Networks, 2024, 174
  • [3] Simple Self-Distillation Learning for Noisy Image Classification. Sasaya, Tenta; Watanabe, Takashi; Ida, Takashi; Ono, Toshiyuki. 2023 IEEE International Conference on Image Processing (ICIP), 2023: 795-799
  • [4] A Self-distillation Lightweight Image Classification Network Scheme. Ni, S.; Ma, X. Beijing Youdian Daxue Xuebao / Journal of Beijing University of Posts and Telecommunications, 2023, 46(06): 66-71
  • [5] Hyperspectral Image Classification Based on Pyramid Coordinate Attention and Weighted Self-Distillation. Shang, Ronghua; Ren, Jinhong; Zhu, Songling; Zhang, Weitong; Feng, Jie; Li, Yangyang; Jiao, Licheng. IEEE Transactions on Geoscience and Remote Sensing, 2022, 60
  • [6] Masked Self-Distillation Domain Adaptation for Hyperspectral Image Classification. Fang, Zhuoqun; He, Wenqiang; Li, Zhaokui; Du, Qian; Chen, Qiusheng. IEEE Transactions on Geoscience and Remote Sensing, 2024, 62
  • [7] A Feature Map Fusion Self-Distillation Scheme for Image Classification Networks. Qin, Zhenkai; Ni, Shuiping; Zhu, Mingfu; Jia, Yue; Liu, Shangxin; Chen, Yawei. Electronics, 2025, 14(01)
  • [8] Sketch Classification and Sketch Based Image Retrieval Using ViT with Self-Distillation for Few Samples. Kang, Sungjae; Seo, Kisung. Journal of Electrical Engineering & Technology, 2024, 19(07): 4587-4593
  • [9] Reverse Self-Distillation Overcoming the Self-Distillation Barrier. Ni, Shuiping; Ma, Xinliang; Zhu, Mingfu; Li, Xingwang; Zhang, Yu-Dong. IEEE Open Journal of the Computer Society, 2023, 4: 195-205
  • [10] SDDA: A progressive self-distillation with decoupled alignment for multimodal image-text classification. Chen, Xiaohao; Shuai, Qianjun; Hu, Feng; Cheng, Yongqiang. Neurocomputing, 2025, 614