Uncertainty Driven Adaptive Self-Knowledge Distillation for Medical Image Segmentation

Cited by: 0
Authors
Guo, Xutao [1 ,2 ]
Wang, Mengqi [3 ]
Xiang, Yang [2 ]
Yang, Yanwu [1 ,2 ]
Ye, Chenfei [4 ,5 ]
Wang, Haijun [6 ,7 ]
Ma, Ting [4 ,8 ]
Affiliations
[1] Harbin Inst Technol Shenzhen, Sch Elect & Informat Engn, Shenzhen 518055, Peoples R China
[2] Peng Cheng Lab, Shenzhen 518066, Peoples R China
[3] Shenzhen Univ, Shenzhen Peoples Hosp 2, Affiliated Hosp 1, Dept Neurooncol, Shenzhen 518037, Peoples R China
[4] Harbin Inst Technol Shenzhen, Sch Biomed Engn & Digital Hlth, Shenzhen 518055, Peoples R China
[5] Harbin Inst Technol, Int Res Inst Artificial Intelligence, Shenzhen 518055, Peoples R China
[6] Sun Yat Sen Univ, Affiliated Hosp 6, Dept Neurosurg, Guangzhou 510655, Peoples R China
[7] First Affiliated Hosp Sun Yat Sen, Dept Neurosurg, Guangzhou 510060, Peoples R China
[8] Harbin Inst Technol, Guangdong Prov Key Lab Aerosp Commun & Networking, Shenzhen 518055, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Biomedical imaging; Training; Predictive models; Image segmentation; Adaptation models; Knowledge engineering; Estimation; Semantics; Computational modeling; Medical image segmentation; overfitting; knowledge distillation; uncertainty; cyclic ensembles; NEURAL-NETWORKS;
DOI
10.1109/TETCI.2025.3526259
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep learning has recently improved the precision of medical image segmentation substantially. However, because medical image segmentation datasets are typically small and training relies on hard labels (one-hot vectors), deep learning models often overfit, which degrades segmentation performance. To mitigate this problem, we propose an uncertainty driven adaptive self-knowledge distillation (UAKD) model for medical image segmentation that regularizes training with self-generated soft labels. The innovation of UAKD is to integrate uncertainty estimation into both soft-label generation and student network training, ensuring accurate supervision and effective regularization. Specifically, UAKD introduces teacher network ensembling to reduce the semantic bias in soft labels caused by individual teachers' fitting biases. We also propose an adaptive knowledge distillation mechanism that uses uncertainty to derive adaptive weights for the soft labels in the loss function, transferring reliable knowledge from the teacher networks to the student network while suppressing unreliable information. Finally, we introduce a gradient ascent based cyclic ensemble method that reduces teacher overfitting on the training data, further strengthening the teacher ensembling and uncertainty estimation above. Experiments on three medical image segmentation tasks show that UAKD outperforms existing regularization methods and demonstrate the effectiveness of uncertainty estimation for assessing soft-label reliability.
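The adaptive distillation mechanism described above (ensemble soft labels, weighted per pixel by an uncertainty estimate) can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch rendering, not the authors' implementation: the function name, the temperature `tau`, and the linear entropy-based weighting are illustrative assumptions.

```python
# Hedged sketch of uncertainty-weighted soft-label distillation, assuming a
# PyTorch setting. Not the paper's exact formulation; normalized predictive
# entropy is one plausible choice of per-pixel uncertainty.
import torch
import torch.nn.functional as F

def uncertainty_weighted_kd_loss(student_logits, teacher_logits_list, tau=2.0):
    """Distill an ensemble of teacher predictions into a student.

    student_logits:      (B, C, H, W) raw logits from the student network.
    teacher_logits_list: list of (B, C, H, W) logits, one per ensemble member.
    tau:                 softmax temperature for the soft labels.
    """
    # Soft labels: mean of the ensemble's temperature-scaled probabilities.
    teacher_probs = torch.stack(
        [F.softmax(t / tau, dim=1) for t in teacher_logits_list]
    )                                        # (K, B, C, H, W)
    soft_labels = teacher_probs.mean(dim=0)  # (B, C, H, W)

    # Per-pixel uncertainty: predictive entropy of the averaged soft labels.
    log_soft = soft_labels.clamp_min(1e-8).log()
    entropy = -(soft_labels * log_soft).sum(dim=1)          # (B, H, W)
    max_entropy = torch.log(torch.tensor(float(soft_labels.shape[1])))

    # Adaptive weight in [0, 1]: confident (low-entropy) pixels count more.
    weight = 1.0 - entropy / max_entropy                    # (B, H, W)

    # Per-pixel KL divergence between ensemble soft labels and the student.
    log_p = F.log_softmax(student_logits / tau, dim=1)
    kl = (soft_labels * (log_soft - log_p)).sum(dim=1)      # (B, H, W)

    # Down-weight unreliable (high-uncertainty) pixels before averaging.
    return (weight * kl).mean() * tau**2
```

Weighting by one minus normalized entropy is only one concrete way to "suppress unreliable information"; the paper may use a different uncertainty measure or weighting scheme.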
Pages: 14