Automatic detection of breast lesions in automated 3D breast ultrasound with cross-organ transfer learning

Cited: 0
Authors
BAO Lingyun [1]
HUANG Z. [2]
LIN Z. [2]
SUN Y. [2]
CHEN H. [3]
LI Y. [4]
LI Z. [5,6]
YUAN X. [2]
XU L. [7]
TAN T. [2]
Affiliations
[1] Affiliated Hangzhou First People's Hospital, School of Medicine, Westlake University, Hangzhou
[2] Faculty of Applied Sciences, Macao Polytechnic University, Macao
[3] Pathology Department, Changsha First Hospital, Changsha
[4] Radiology Department, Changsha First Hospital, Changsha
[5] College of Aerospace Science and Engineering, National University of Defense Technology, Changsha
[6] Hunan Provincial Key Laboratory of Image Measurement and Vision Navigation, Changsha
[7] School of Information Science and Technology, Shanghaitech University, Shanghai
Source
Virtual Reality and Intelligent Hardware | 2024, Vol. 6, No. 3
Keywords
Automated 3D breast ultrasound; Breast cancers; Breast ultrasound; Computer-aided diagnosis; Convolutional neural networks; Cross-organ learning; Deep learning; Transfer learning
DOI
10.1016/j.vrih.2024.02.001
Abstract
Background: Deep convolutional neural networks have garnered considerable attention in numerous machine learning applications, particularly in visual recognition tasks such as image and video analysis, and there is growing interest in applying this technology to medical image analysis. Automated three-dimensional breast ultrasound is a vital tool for detecting breast cancer, and computer-aided diagnosis software developed with deep learning can effectively assist radiologists in diagnosis. However, the network model is prone to overfitting during training owing to challenges such as insufficient training data. This study aims to mitigate the problems caused by small datasets and to improve detection performance. Methods: We propose a deep learning-based breast cancer detection framework that combines a transfer learning method based on cross-organ cancer detection with a contrastive learning method based on the Breast Imaging Reporting and Data System (BI-RADS). Results: With cross-organ transfer learning and BI-RADS-based contrastive learning, the average sensitivity of the model increased by up to 16.05%. Conclusion: Our experiments demonstrate that the parameters and experience of cross-organ cancer detection can be mutually referenced, and that the BI-RADS-based contrastive learning method improves the detection performance of the model. © 2024 Beijing Zhongke Journal Publishing Co. Ltd
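The sketch below is not taken from the paper; it is a minimal illustration, assuming a PyTorch workflow, of the two ideas named in the abstract: (1) cross-organ transfer learning, i.e., initialising a 3D detection backbone from a checkpoint trained on another organ's cancer-detection task and fine-tuning it on ABUS volumes, and (2) a supervised contrastive loss that treats volumes sharing the same BI-RADS category as positives. The backbone choice (`r3d_18`), the checkpoint path, and the temperature value are illustrative assumptions, not details from the paper.

```python
# Illustrative sketch only -- not the authors' implementation.
import torch
import torch.nn.functional as F
from torchvision.models.video import r3d_18  # 3D CNN backbone (an assumption)


def build_cross_organ_backbone(pretrained_ckpt_path: str) -> torch.nn.Module:
    """Cross-organ transfer: load weights trained on another organ's
    cancer-detection task, then fine-tune all layers on ABUS data."""
    model = r3d_18(weights=None)
    state = torch.load(pretrained_ckpt_path, map_location="cpu")
    # strict=False tolerates head layers that differ between the two tasks
    model.load_state_dict(state, strict=False)
    return model


def birads_contrastive_loss(embeddings: torch.Tensor,
                            birads_labels: torch.Tensor,
                            temperature: float = 0.1) -> torch.Tensor:
    """Supervised contrastive loss keyed by BI-RADS category:
    samples with the same category are positives, all others negatives."""
    z = F.normalize(embeddings, dim=1)                        # (B, D)
    sim = z @ z.t() / temperature                             # pairwise similarity
    pos_mask = (birads_labels[:, None] == birads_labels[None, :]).float()
    pos_mask.fill_diagonal_(0)                                # drop self-pairs
    logits_mask = 1.0 - torch.eye(len(z), device=sim.device)  # exclude self from denominator
    exp_sim = torch.exp(sim) * logits_mask
    log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True) + 1e-12)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    return -(pos_mask * log_prob).sum(dim=1).div(pos_count).mean()
```

In such a setup the contrastive loss would typically be added to the detection loss during fine-tuning, so that lesion embeddings cluster by BI-RADS category while the detector is trained.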
Pages: 239-251
Page count: 12