Less-supervised learning with knowledge distillation for sperm morphology analysis

Cited by: 0
Authors
Nabipour, Ali [1 ]
Nejati, Mohammad Javad Shams [1 ]
Boreshban, Yasaman [1 ]
Mirroshandel, Seyed Abolghasem [1 ]
Affiliations
[1] Univ Guilan, Fac Engn, Dept Comp Engn, 5th Kilometer Persian Gulf Highway, POB 1841, Rasht, Guilan, Iran
Source
COMPUTER METHODS IN BIOMECHANICS AND BIOMEDICAL ENGINEERING-IMAGING AND VISUALIZATION | 2024, Vol. 12, No. 1
Keywords
Human sperm morphometry; sperm defects; infertility; deep learning; knowledge distillation
DOI
10.1080/21681163.2024.2347978
CLC Classification Number
R318 [Biomedical Engineering]
Discipline Classification Code
0831
Abstract
Sperm Morphology Analysis (SMA) is pivotal in diagnosing male infertility, but manual analysis is subjective and time-intensive. Artificial intelligence offers automated alternatives, yet hurdles such as limited data and poor image quality hinder its efficacy and prevent Deep Learning (DL) models from capturing crucial sperm features. A method that enables DL models to learn sample nuances even with limited data would therefore be invaluable. This study proposes a Knowledge Distillation (KD) method to distinguish normal from abnormal sperm cells, leveraging the Modified Human Sperm Morphology Analysis dataset. Despite its low-resolution, blurry images, our method yields relevant results. We train the model for anomaly detection exclusively on normal samples, which is crucial in scenarios lacking abnormal data - a common issue in medical tasks. Our aim is to train an anomaly detection model on a dataset comprising unclear images and limited samples, without direct exposure to abnormal data. Our method achieves ROC-AUC scores of 70.4%, 87.6%, and 71.1% for head, vacuole, and acrosome, respectively, and matches the performance of traditional DL models while using less than 70% of the data. This less-supervised approach shows promise for advancing SMA despite data scarcity. Furthermore, KD makes the model adaptable to edge devices in fertility clinics, since it requires less processing power.
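As a rough illustration of the anomaly-detection setup described in the abstract, the following is a minimal PyTorch-style sketch of teacher-student knowledge distillation trained only on normal images, where the teacher-student feature discrepancy serves as the anomaly score and ROC-AUC is used for evaluation. The backbone choice (ResNet-18), the MSE distillation loss, and the helper names are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: KD-based anomaly detection trained only on normal samples.
# Backbone, loss, and function names are hypothetical, not the paper's code.
import torch
import torch.nn as nn
import torchvision.models as models
from sklearn.metrics import roc_auc_score

# Teacher: frozen, ImageNet-pretrained feature extractor (assumption).
teacher = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
teacher.fc = nn.Identity()
teacher.eval()
for p in teacher.parameters():
    p.requires_grad_(False)

# Student: randomly initialised network trained to mimic the teacher.
student = models.resnet18(weights=None)
student.fc = nn.Identity()

criterion = nn.MSELoss()
optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)

def train_step(normal_batch):
    """One distillation step on a batch of *normal* sperm images only."""
    student.train()
    with torch.no_grad():
        t_feat = teacher(normal_batch)
    s_feat = student(normal_batch)
    loss = criterion(s_feat, t_feat)  # student matches teacher on normal data
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

@torch.no_grad()
def anomaly_scores(batch):
    """Per-image anomaly score: teacher-student feature discrepancy."""
    student.eval()
    t_feat, s_feat = teacher(batch), student(batch)
    return ((t_feat - s_feat) ** 2).mean(dim=1)  # large gap -> likely abnormal

# Evaluation with ROC-AUC, as reported in the abstract (labels: 1 = abnormal):
# scores = anomaly_scores(test_images)
# auc = roc_auc_score(test_labels, scores.cpu().numpy())
```

Because the student never sees abnormal cells during training, it only learns to reproduce the teacher's features on normal morphology; at test time, abnormal heads, vacuoles, or acrosomes produce larger feature discrepancies and hence higher anomaly scores.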
Pages: 16