Uncertainty Driven Adaptive Self-Knowledge Distillation for Medical Image Segmentation

Cited: 0
Authors
Guo, Xutao [1 ,2 ]
Wang, Mengqi [3 ]
Xiang, Yang [2 ]
Yang, Yanwu [1 ,2 ]
Ye, Chenfei [4 ,5 ]
Wang, Haijun [6 ,7 ]
Ma, Ting [4 ,8 ]
Affiliations
[1] Harbin Inst Technol Shenzhen, Sch Elect & Informat Engn, Shenzhen 518055, Peoples R China
[2] Peng Cheng Lab, Shenzhen 518066, Peoples R China
[3] Shenzhen Univ, Shenzhen Peoples Hosp 2, Affiliated Hosp 1, Dept Neurooncol, Shenzhen 518037, Peoples R China
[4] Harbin Inst Technol Shenzhen, Sch Biomed Engn & Digital Hlth, Shenzhen 518055, Peoples R China
[5] Harbin Inst Technol, Int Res Inst Artificial Intelligence, Shenzhen 518055, Peoples R China
[6] Sun Yat Sen Univ, Affiliated Hosp 6, Dept Neurosurg, Guangzhou 510655, Peoples R China
[7] First Affiliated Hosp Sun Yat Sen, Dept Neurosurg, Guangzhou 510060, Peoples R China
[8] Harbin Inst Technol, Guangdong Prov Key Lab Aerosp Commun & Networking, Shenzhen 518055, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Biomedical imaging; Training; Predictive models; Image segmentation; Adaptation models; Knowledge engineering; Estimation; Semantics; Computational modeling; Medical image segmentation; overfitting; knowledge distillation; uncertainty; cyclic ensembles; NEURAL-NETWORKS;
DOI
10.1109/TETCI.2025.3526259
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep learning has recently improved the precision of medical image segmentation significantly. However, because medical image segmentation datasets are typically small and training relies on hard labels (one-hot vectors), deep learning models often overfit, which degrades segmentation performance. To mitigate this problem, we propose an uncertainty-driven adaptive self-knowledge distillation (UAKD) model for medical image segmentation that regularizes training with self-generated soft labels. The innovation of UAKD is to integrate uncertainty estimation into both soft-label generation and student network training, ensuring accurate supervision and effective regularization. Specifically, UAKD introduces teacher network ensembling to reduce the semantic bias in soft labels caused by the fitting biases of individual teacher networks. It also proposes an adaptive knowledge distillation mechanism that uses uncertainty to assign adaptive weights to the soft labels in the loss function, efficiently transferring reliable knowledge from the teacher networks to the student network while suppressing unreliable information. Finally, we introduce a gradient-ascent-based cyclic ensemble method that reduces teacher network overfitting on the training data, further strengthening the teacher ensembling and uncertainty estimation described above. Experiments on three medical image segmentation tasks show that UAKD outperforms existing regularization methods and demonstrate the effectiveness of uncertainty estimation for assessing soft-label reliability.
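As a rough illustration of the adaptive distillation mechanism the abstract describes, the sketch below derives per-pixel soft labels and an uncertainty map from a teacher ensemble and down-weights the distillation loss where the teachers disagree. This is a minimal PyTorch sketch, not the paper's implementation: the function name, the variance-based uncertainty estimator, the exponential weighting exp(-uncertainty), and all hyperparameter values are illustrative assumptions.

import torch
import torch.nn.functional as F

def uncertainty_weighted_kd_loss(student_logits, teacher_logits_list,
                                 hard_labels, temperature=2.0, lam=0.5):
    # student_logits:      (B, C, H, W) logits from the student network
    # teacher_logits_list: list of (B, C, H, W) logits from the teacher ensemble
    # hard_labels:         (B, H, W) integer ground-truth masks
    # Ensemble soft labels: mean of the teachers' tempered softmax outputs.
    teacher_probs = torch.stack(
        [F.softmax(t / temperature, dim=1) for t in teacher_logits_list])
    soft_labels = teacher_probs.mean(dim=0)            # (B, C, H, W)
    # Per-pixel uncertainty: predictive variance across ensemble members,
    # summed over classes (one plausible estimator among several).
    uncertainty = teacher_probs.var(dim=0).sum(dim=1)  # (B, H, W)
    # Adaptive weight: suppress distillation where the teachers disagree.
    weight = torch.exp(-uncertainty)                   # (B, H, W)
    # Pixel-wise KL divergence between the ensemble soft labels and the
    # student's tempered prediction.
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    kl = (soft_labels
          * (soft_labels.clamp_min(1e-8).log() - log_student)).sum(dim=1)
    kd_term = (weight * kl).mean() * temperature ** 2
    # Standard supervised term on the hard labels plus the weighted
    # distillation term.
    ce_term = F.cross_entropy(student_logits, hard_labels)
    return ce_term + lam * kd_term

Under this weighting, pixels on which the ensemble agrees dominate the distillation term, which matches the abstract's stated goal of transferring reliable knowledge while suppressing unreliable soft labels.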
Pages: 14