A Transformer-Based Knowledge Distillation Network for Cortical Cataract Grading

Times Cited: 1
Authors
Wang, Jinhong [1 ,2 ]
Xu, Zhe [3 ]
Zheng, Wenhao [1 ,2 ]
Ying, Haochao [4 ]
Chen, Tingting [1 ,2 ]
Liu, Zuozhu [5 ]
Chen, Danny Z. [6 ]
Yao, Ke [3 ]
Wu, Jian [7 ,8 ]
Affiliations
[1] Zhejiang Univ, Affiliated Hosp 2, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
[2] Zhejiang Univ, Affiliated Hosp 2, Eye Ctr, Hangzhou 310027, Peoples R China
[3] Zhejiang Univ, Affiliated Hosp 2, Eye Ctr, Sch Med, Hangzhou 310009, Zhejiang, Peoples R China
[4] Zhejiang Univ, Sch Publ Hlth, Hangzhou 310058, Peoples R China
[5] Zhejiang Univ, ZJU UIUC Inst, Res & Dev Ctr Intelligent Healthcare, ZJU Angelalign Inc, Haining 310058, Peoples R China
[6] Univ Notre Dame, Dept Comp Sci & Engn, Notre Dame, IN 46556 USA
[7] Zhejiang Univ, Affiliated Hosp 2, Sch Med, Sch Publ Hlth, Hangzhou 310058, Peoples R China
[8] Zhejiang Univ, Inst Wenzhou, Hangzhou 310058, Peoples R China
Keywords
Cataracts; Transformers; Annotations; Feature extraction; Image edge detection; Fuses; Knowledge engineering; Cataract grading; knowledge distillation; transformer; medical imaging classification; CLASSIFICATION; IMAGES;
DOI
10.1109/TMI.2023.3327274
Chinese Library Classification (CLC)
TP39 [Computer Applications];
Discipline Classification Code
081203; 0835;
Abstract
Cortical cataract, a common type of cataract, is particularly difficult to diagnose automatically because of the complex features of its lesions. Recently, many methods based on edge detection or deep learning have been proposed for automatic cataract grading. However, these methods suffer a large performance drop on cortical cataract grading because of the more complex cortical opacities and uncertain data. In this paper, we propose a novel Transformer-based Knowledge Distillation Network, called TKD-Net, for cortical cataract grading. To tackle the complex-opacity problem, we first devise a zone decomposition strategy to extract more refined features, and introduce special sub-scores that capture the critical factors of clinical cortical opacity assessment (location, area, and density) for comprehensive quantification. Next, we develop a multi-modal mix-attention Transformer to efficiently fuse the sub-score and image modalities for complex feature learning. However, obtaining the sub-score modality in the clinic is challenging, which leads to a modality-missing problem. To simultaneously alleviate the modality-missing and uncertain-data issues, we further design a Transformer-based knowledge distillation method in which a teacher model trained with complete data guides a student model with modality-missing and uncertain data. Extensive experiments on a dataset of commonly used slit-lamp images annotated with the LOCS III grading system demonstrate that our TKD-Net outperforms state-of-the-art methods and verify the effectiveness of its key components.
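The distillation scheme described in the abstract, in which a teacher with complete image and sub-score modalities guides a student that sees images only, can be illustrated with a minimal soft-label knowledge distillation sketch in PyTorch. This is an assumption-laden illustration rather than the authors' TKD-Net: the module structure, feature dimensions, grade count, and loss weighting below are all hypothetical stand-ins.

```python
# Minimal, illustrative sketch (not the authors' code): a teacher that sees both
# image features and clinical sub-scores guides a student that sees images only,
# via soft-label (KL-divergence) knowledge distillation.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_GRADES = 5              # assumed number of cortical cataract grades
IMG_DIM, SCORE_DIM = 256, 3 # assumed feature sizes (e.g., location/area/density sub-scores)

class TeacherNet(nn.Module):
    """Sees both modalities: image features and sub-scores."""
    def __init__(self):
        super().__init__()
        self.fuse = nn.Linear(IMG_DIM + SCORE_DIM, 128)
        self.head = nn.Linear(128, NUM_GRADES)

    def forward(self, img_feat, sub_scores):
        x = torch.cat([img_feat, sub_scores], dim=-1)
        return self.head(F.relu(self.fuse(x)))

class StudentNet(nn.Module):
    """Sees images only, i.e., the sub-score modality is missing."""
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(IMG_DIM, 128)
        self.head = nn.Linear(128, NUM_GRADES)

    def forward(self, img_feat):
        return self.head(F.relu(self.proj(img_feat)))

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Standard soft-label KD: cross-entropy on ground truth plus
    temperature-scaled KL divergence to the teacher's predictions."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * ce + (1.0 - alpha) * kd

# Toy training step with random tensors standing in for extracted features.
teacher, student = TeacherNet(), StudentNet()
img_feat = torch.randn(8, IMG_DIM)
sub_scores = torch.randn(8, SCORE_DIM)
labels = torch.randint(0, NUM_GRADES, (8,))

with torch.no_grad():                 # teacher is frozen during distillation
    t_logits = teacher(img_feat, sub_scores)
s_logits = student(img_feat)          # student never sees sub_scores
loss = distillation_loss(s_logits, t_logits, labels)
loss.backward()
```

In sketches of this kind, the temperature T softens the teacher's output distribution so the student can learn from inter-grade similarities rather than from the hard label alone.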
Pages: 1089-1101
Number of Pages: 13
Related Papers
50 records in total
  • [1] Knowledge Distillation and Transformer-Based Framework for Automatic Spine CT Report Generation
    Batool, Humaira
    Mukhtar, Asmat
    Gul Khawaja, Sajid
    Alghamdi, Norah Saleh
    Mansoor Khan, Asad
    Qayyum, Adil
    Adil, Ruqqayia
    Khan, Zawar
    Usman Akram, Muhammad
    Usman Akbar, Muhammad
    Eklund, Anders
    IEEE ACCESS, 2025, 13 : 42949 - 42964
  • [2] A Transformer-Based Network for Hyperspectral Object Tracking
    Gao, Long
    Chen, Langkun
    Liu, Pan
    Jiang, Yan
    Xie, Weiying
    Li, Yunsong
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61
  • [3] DeNKD: Decoupled Non-Target Knowledge Distillation for Complementing Transformer-Based Unsupervised Domain Adaptation
    Mei, Zhen
    Ye, Peng
    Li, Baopu
    Chen, Tao
    Fan, Jiayuan
    Ouyang, Wanli
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (05) : 3220 - 3231
  • [4] Transformer-Based Distillation Hash Learning for Image Retrieval
    Lv, Yuanhai
    Wang, Chongyan
    Yuan, Wanteng
    Qian, Xiaohao
    Yang, Wujun
    Zhao, Wanqing
    ELECTRONICS, 2022, 11 (18)
  • [5] TraKDis: A Transformer-Based Knowledge Distillation Approach for Visual Reinforcement Learning With Application to Cloth Manipulation
    Chen, Wei
    Rojas, Nicolas
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (03) : 2455 - 2462
  • [6] A Distributed Knowledge Distillation Framework for Financial Fraud Detection Based on Transformer
    Tang, Yuxuan
    Liu, Zhanjun
    IEEE ACCESS, 2024, 12 : 62899 - 62911
  • [7] A transformer-based low-resolution face recognition method via on-and-offline knowledge distillation
    Song, Yaozhe
    Tang, Hongying
    Meng, Fangzhou
    Wang, Chaoyi
    Wu, Mengmeng
    Shu, Ziting
    Tong, Guanjun
    NEUROCOMPUTING, 2022, 509 : 193 - 205
  • [8] GeoFormer: An Effective Transformer-Based Siamese Network for UAV Geolocalization
    Li, Qingge
    Yang, Xiaogang
    Fan, Jiwei
    Lu, Ruitao
    Tang, Bin
    Wang, Siyu
    Su, Shuang
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2024, 17 : 9470 - 9491
  • [9] Transformer-Based Attention Network for In-Vehicle Intrusion Detection
    Nguyen, Trieu Phong
    Nam, Heungwoo
    Kim, Daehee
    IEEE ACCESS, 2023, 11 : 55389 - 55403
  • [10] A Transformer-Based Signal Denoising Network for AoA Estimation in NLoS Environments
    Liu, Junchen
    Wang, Tianyu
    Li, Yuxiao
    Li, Cheng
    Wang, Yi
    Shen, Yuan
    IEEE COMMUNICATIONS LETTERS, 2022, 26 (10) : 2336 - 2339