A Study on Automatic O-RADS Classification of Sonograms of Ovarian Adnexal Lesions Based on Deep Convolutional Neural Networks

Cited by: 1
Authors
Liu, Tao [1 ]
Miao, Kuo [1 ]
Tan, Gaoqiang [1 ]
Bu, Hanqi [1 ]
Shao, Xiaohui [1 ]
Wang, Siming [1 ]
Dong, Xiaoqiu [1 ]
Affiliations
[1] Harbin Med Univ, Affiliated Hosp 4, Dept Ultrasound Med, Harbin, Heilongjiang, Peoples R China
Keywords
Deep convolutional neural network (DCNN); Ovarian adnexal lesions; O-RADS; Sonograms; Deep learning; MULTICENTER; MANAGEMENT; DIAGNOSIS; MODEL;
DOI
10.1016/j.ultrasmedbio.2024.11.009
Chinese Library Classification (CLC)
O42 [Acoustics]
Subject Classification Codes
070206; 082403
Abstract
Objective: This study explored a new method for automatic O-RADS classification of sonograms based on a deep convolutional neural network (DCNN).
Methods: A development dataset (DD) of 2,455 2D grayscale sonograms of 870 ovarian adnexal lesions and an intertemporal validation dataset (IVD) of 426 sonograms of 280 lesions were collected and classified according to O-RADS v2022 (categories 2-5) by three senior sonographers. Classification results whose malignancy rates a two-tailed z-test confirmed to be consistent with the O-RADS v2022 reference rates, indicating diagnostic performance comparable to that of a previous study, were used for training; otherwise, the lesions were reclassified by two different sonographers. The DD was used to develop three DCNN models (ResNet34, DenseNet121, and ConvNeXt-Tiny) with transfer learning. Model performance was assessed by accuracy, precision, and F1 score, among other metrics. The optimal model was selected, validated over time on the IVD, and used to analyze whether its assistance improved the efficiency of O-RADS classification for three sonographers with different years of experience.
Results: The proportion of malignant tumors in the DD and IVD in each O-RADS risk category was verified with a two-tailed z-test. Malignant lesions (O-RADS categories 4 and 5) were diagnosed in the DD and IVD with sensitivities of 0.949 and 0.962 and specificities of 0.892 and 0.842, respectively. ResNet34, DenseNet121, and ConvNeXt-Tiny achieved overall accuracies of 0.737, 0.752, and 0.878, respectively, for sonogram prediction in the DD. The ConvNeXt-Tiny model's accuracy for sonogram prediction in the IVD was 0.859, with no significant difference between the test sets. The model's assistance significantly reduced O-RADS classification time for the three sonographers (Cohen's d = 5.75).
Conclusion: ConvNeXt-Tiny showed robust and stable performance in classifying O-RADS categories 2-5 and improved the sonographers' classification efficiency.
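The abstract describes transfer-learning fine-tuning of an ImageNet-pretrained ConvNeXt-Tiny (alongside ResNet34 and DenseNet121) on 2D grayscale sonograms for four-way O-RADS classification (categories 2-5). The following is a minimal PyTorch/torchvision sketch of that kind of pipeline, not the authors' actual code: the dataset path, folder layout, preprocessing, and hyperparameters are illustrative assumptions.

```python
# Illustrative sketch only: every path and hyperparameter below is an assumption,
# not taken from the paper.
import torch
import torch.nn as nn
from torchvision import models, transforms
from torchvision.datasets import ImageFolder
from torch.utils.data import DataLoader

NUM_CLASSES = 4  # O-RADS categories 2-5

# Transfer learning: load ConvNeXt-Tiny with ImageNet weights and replace the
# final classifier layer with a 4-way head.
model = models.convnext_tiny(weights=models.ConvNeXt_Tiny_Weights.IMAGENET1K_V1)
in_features = model.classifier[2].in_features
model.classifier[2] = nn.Linear(in_features, NUM_CLASSES)

# Grayscale sonograms are replicated to 3 channels to match ImageNet pretraining.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical directory layout: one subfolder per O-RADS category.
train_ds = ImageFolder("development_dataset/train", transform=preprocess)
train_loader = DataLoader(train_ds, batch_size=32, shuffle=True)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

model.train()
for epoch in range(10):  # epoch count is arbitrary here
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Accuracy, precision, and F1 score as reported in the abstract could then be computed on held-out sonograms (e.g., with scikit-learn's classification_report), and the intertemporal validation set would be evaluated the same way without further training.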
Pages: 387-395 (9 pages)