A Study on Automatic O-RADS Classification of Sonograms of Ovarian Adnexal Lesions Based on Deep Convolutional Neural Networks

Cited by: 1
Authors
Liu, Tao [1 ]
Miao, Kuo [1 ]
Tan, Gaoqiang [1 ]
Bu, Hanqi [1 ]
Shao, Xiaohui [1 ]
Wang, Siming [1 ]
Dong, Xiaoqiu [1 ]
Affiliations
[1] Harbin Med Univ, Affiliated Hosp 4, Dept Ultrasound Med, Harbin, Heilongjiang, Peoples R China
Keywords
Deep convolutional neural network (DCNN); Ovarian adnexal lesions; O-RADS; Sonograms; Deep learning; MULTICENTER; MANAGEMENT; DIAGNOSIS; MODEL;
DOI
10.1016/j.ultrasmedbio.2024.11.009
Chinese Library Classification (CLC)
O42 [Acoustics];
Discipline Classification Codes
070206; 082403;
Abstract
Objective: This study explored a new method for automatic O-RADS classification of sonograms based on a deep convolutional neural network (DCNN).

Methods: A development dataset (DD) of 2,455 2D grayscale sonograms of 870 ovarian adnexal lesions and an intertemporal validation dataset (IVD) of 426 sonograms of 280 lesions were collected and classified according to O-RADS v2022 (categories 2-5) by three senior sonographers. Classifications whose per-category malignancy rates were confirmed by a two-tailed z-test to be consistent with O-RADS v2022, indicating diagnostic performance comparable to that of a previous study, were used for training; otherwise, the classification was repeated by two different sonographers. The DD was used to develop three DCNN models (ResNet34, DenseNet121, and ConvNeXt-Tiny) with transfer learning. Model performance was assessed with accuracy, precision, F1 score, and other metrics. The optimal model was selected and temporally validated on the IVD, and we assessed whether model assistance improved the efficiency of O-RADS classification for three sonographers with different years of experience.

Results: The proportion of malignant tumors in each O-RADS-defined risk category in the DD and IVD was verified with a two-tailed z-test. Malignant lesions (O-RADS categories 4 and 5) were diagnosed in the DD and IVD with sensitivities of 0.949 and 0.962 and specificities of 0.892 and 0.842, respectively. ResNet34, DenseNet121, and ConvNeXt-Tiny achieved overall accuracies of 0.737, 0.752, and 0.878, respectively, for sonogram prediction in the DD. The ConvNeXt-Tiny model's accuracy for sonogram prediction in the IVD was 0.859, with no significant difference between test sets. Model assistance significantly reduced the O-RADS classification time for the three sonographers (Cohen's d = 5.75).

Conclusion: ConvNeXt-Tiny showed robust and stable performance in classifying O-RADS categories 2-5 and improved sonographers' classification efficiency.
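The abstract describes transfer learning from an ImageNet-pretrained ConvNeXt-Tiny backbone to a four-class O-RADS (categories 2-5) sonogram classifier. The sketch below illustrates one plausible PyTorch/torchvision setup; it is not the authors' code, and the directory name, 224x224 input size, batch size, and optimizer settings are assumptions.

# Sketch (not the authors' code): fine-tuning an ImageNet-pretrained ConvNeXt-Tiny
# for four-class O-RADS (categories 2-5) prediction from 2D grayscale sonograms.
# Folder layout, input size, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import models, transforms
from torchvision.datasets import ImageFolder

NUM_CLASSES = 4  # O-RADS categories 2, 3, 4, 5

# Transfer learning: load ImageNet weights, then replace the classification head.
model = models.convnext_tiny(weights=models.ConvNeXt_Tiny_Weights.IMAGENET1K_V1)
model.classifier[2] = nn.Linear(model.classifier[2].in_features, NUM_CLASSES)

# Grayscale sonograms are replicated to 3 channels to match the pretrained input.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# "dd_train/" is a hypothetical directory with one subfolder per O-RADS category.
train_loader = DataLoader(ImageFolder("dd_train/", transform=preprocess),
                          batch_size=32, shuffle=True)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in train_loader:  # one epoch shown for brevity
    images, labels = images.to(device), labels.to(device)
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()

The statistical checks named in the abstract could be computed as in the following sketch, under assumptions: the malignancy-rate check is framed as a one-sample, two-tailed z-test of an observed proportion against a fixed O-RADS v2022 reference rate, and the effect size as a paired-samples Cohen's d for classification time with versus without model assistance. The exact formulations the authors used are not given in the abstract, and the example numbers are invented.

# Sketch of the statistical checks, under the assumptions stated above.
import numpy as np
from scipy.stats import norm

def malignancy_rate_ztest(n_malignant, n_lesions, reference_rate):
    """Two-tailed z-test of an observed proportion against a reference rate."""
    p_hat = n_malignant / n_lesions
    se = np.sqrt(reference_rate * (1 - reference_rate) / n_lesions)
    z = (p_hat - reference_rate) / se
    return z, 2 * norm.sf(abs(z))  # z statistic and two-tailed p-value

def cohens_d_paired(time_unassisted, time_assisted):
    """Cohen's d for paired measurements (reduction in classification time)."""
    diff = np.asarray(time_unassisted) - np.asarray(time_assisted)
    return diff.mean() / diff.std(ddof=1)

# Illustrative call with hypothetical counts for a single O-RADS category:
z, p = malignancy_rate_ztest(n_malignant=12, n_lesions=150, reference_rate=0.10)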
Pages: 387-395
Number of pages: 9