Less-supervised learning with knowledge distillation for sperm morphology analysis

Cited: 0
Authors
Nabipour, Ali [1 ]
Nejati, Mohammad Javad Shams [1 ]
Boreshban, Yasaman [1 ]
Mirroshandel, Seyed Abolghasem [1 ]
Affiliations
[1] Univ Guilan, Fac Engn, Dept Comp Engn, 5th Kilometer Persian Gulf Highway, POB 1841, Rasht, Guilan, Iran
Keywords
Human sperm morphometry; sperm defects; infertility; deep learning; knowledge distillation;
DOI
10.1080/21681163.2024.2347978
Chinese Library Classification (CLC) Number
R318 [Biomedical Engineering];
Subject Classification Code
0831;
Abstract
Sperm Morphology Analysis (SMA) is pivotal in diagnosing male infertility, but manual analysis is subjective and time-intensive. Artificial intelligence offers automated alternatives, yet hurdles such as limited data and image quality constraints hinder its efficacy. These challenges prevent Deep Learning (DL) models from grasping crucial sperm features. A solution enabling DL models to learn sample nuances, even with limited data, would be invaluable. This study proposes a Knowledge Distillation (KD) method to distinguish normal from abnormal sperm cells, leveraging the Modified Human Sperm Morphology Analysis dataset. Despite low-resolution, blurry images, our method yields relevant results. We use only normal samples to train the model for anomaly detection, which is crucial in scenarios lacking abnormal data, a common issue in medical tasks. Our aim is to train an anomaly detection model on a dataset of unclear images and limited samples, without direct exposure to abnormal data. Our method achieves ROC AUC scores of 70.4%, 87.6%, and 71.1% for the head, vacuole, and acrosome labels, respectively, and matches the performance of traditional DL models while using less than 70% of the data. This less-supervised approach shows promise for advancing SMA despite data scarcity. Furthermore, KD enables the model to be adapted to edge devices in fertility clinics, as it requires less processing power.
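The abstract describes the approach only at a high level. As a rough illustration of the distillation-style anomaly detection it outlines (a student network is trained to mimic a frozen teacher on normal sperm images only, and test images are scored by the teacher-student feature discrepancy), a minimal PyTorch sketch follows. The ResNet-18 backbone, the MSE objective, and all function names are illustrative assumptions, not the authors' published implementation.

import torch
import torch.nn as nn
import torchvision.models as models

# Teacher: frozen, ImageNet-pretrained feature extractor (assumed backbone).
teacher = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
teacher.fc = nn.Identity()
teacher.eval()
for p in teacher.parameters():
    p.requires_grad = False

# Student: same backbone, randomly initialised, trained from scratch.
student = models.resnet18(weights=None)
student.fc = nn.Identity()

optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)
mse = nn.MSELoss()

def train_step(normal_batch):
    # One distillation step on a batch of normal images only.
    with torch.no_grad():
        t_feat = teacher(normal_batch)
    s_feat = student(normal_batch)
    loss = mse(s_feat, t_feat)  # student learns to mimic the teacher on normal data
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

@torch.no_grad()
def anomaly_score(batch):
    # Teacher-student feature discrepancy; it stays small on cells that resemble
    # the normal training distribution and grows on unseen (abnormal) cells.
    t_feat = teacher(batch)
    s_feat = student(batch)
    return ((t_feat - s_feat) ** 2).mean(dim=1)  # one score per image

At evaluation time, the per-image scores can be compared against normal/abnormal labels with a standard ROC AUC computation (e.g. sklearn.metrics.roc_auc_score), which is the metric reported in the abstract.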
Pages: 16