Image Segmentation of Rectal Tumor Based on Improved U-Net Model with Deep Learning

Cited by: 1
Authors
Zhou, Faguo [1 ]
Ye, Yuansheng [1 ]
Song, Yanan [1 ]
Affiliation
[1] China Univ Min & Technol, Sch Mech Elect & Informat Engn, Beijing, Peoples R China
Funding
National Natural Science Foundation of China
关键词
Rectal tumor segmentation; Fuzzy logic; Attention mechanism; U-Net Model; Loop-back residual network; EDGE; ALGORITHM;
DOI
10.1007/s11265-021-01710-x
CLC classification number
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
Rectal tumor is a common malignancy of the intestine. Its death rate ranks fourth among malignant tumors of the digestive system, which seriously threatens the life and health of patients. Endoscopic ultrasonography is the most commonly used method to detect rectal tumors. After obtaining the images, doctors diagnose the condition with the naked eye and their experience, which imposes a considerable workload on both doctor and patient. With the development of deep learning and the continuous iteration of convolutional neural networks, more and more techniques have been applied in the field of medical imaging. Therefore, this paper studies and improves an ultrasonic image segmentation U-Net model for rectal tumors based on a fuzzy logic attention mechanism. This paper first preprocesses the original image, enhancing the details and reducing the image size. Then the image feature map is weighted by fuzzy logic and an attention mechanism. In addition, a loop-back residual mechanism is used to optimize the model. Finally, the results of several models are analyzed and compared. The results show that, compared with the U-Net model, the optimized model improves image segmentation precision by nearly 3%, with recall almost unchanged, and both IoU and Dice increase by about 2%. Overall, the model has good segmentation performance, and the introduction of RoI-aware U-Net greatly reduces video memory usage.
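The abstract reports improvements in IoU and Dice, the two standard overlap metrics for segmentation masks. As a minimal illustrative sketch (not code from the paper), the metrics can be computed for binary masks as follows; the function name and the toy masks are assumptions for demonstration:

```python
import numpy as np

def iou_and_dice(pred, target):
    """Compute IoU and Dice for two binary segmentation masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    total = pred.sum() + target.sum()
    # Convention: empty masks on both sides count as a perfect match.
    iou = intersection / union if union else 1.0
    dice = 2 * intersection / total if total else 1.0
    return iou, dice

# Toy 3x3 masks: prediction and ground truth overlap in 2 of 4 union pixels.
pred = np.array([[1, 1, 0],
                 [0, 1, 0],
                 [0, 0, 0]])
target = np.array([[1, 1, 0],
                   [0, 0, 1],
                   [0, 0, 0]])
iou, dice = iou_and_dice(pred, target)
# intersection = 2, union = 4  ->  IoU = 0.5, Dice = 2*2/(3+3) ~= 0.667
```

Note that Dice is always at least as large as IoU for the same masks (Dice = 2·IoU / (1 + IoU)), so a ~2% gain in both, as reported above, is mutually consistent.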
Pages: 1145-1157
Page count: 13
Related Papers
(50 records in total)
  • [41] Semantic segmentation of urban environments: Leveraging U-Net deep learning model for cityscape image analysis
    Arulananth, T. S.
    Kuppusamy, P. G.
    Ayyasamy, Ramesh Kumar
    Alhashmi, Saadat M.
    Mahalakshmi, M.
    Vasanth, K.
    Chinnasamy, P.
    PLOS ONE, 2024, 19 (04):
  • [42] A retinal vessel segmentation method based improved U-Net model
    Sun, Kun
    Chen, Yang
    Chao, Yi
    Geng, Jiameng
    Chen, Yinsheng
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 82
  • [43] An Improved Small Feature Segmentation Algorithm Based on U-Net Model
    Chen, Weikun
    Yao, Changjuan
    Yang, Ming
    Zhang, Yanchang
    2024 IEEE INTERNATIONAL CONFERENCE ON MECHATRONICS AND AUTOMATION, ICMA 2024, 2024, : 591 - 595
  • [44] Imaging segmentation mechanism for rectal tumors using improved U-Net
    Zhang, Kenan
    Yang, Xiaotang
    Cui, Yanfen
    Zhao, Jumin
    Li, Dengao
    BMC MEDICAL IMAGING, 2024, 24 (01)
  • [45] Fundus Retinal Vessels Image Segmentation Method Based on Improved U-Net
    Han, J.
    Wang, Y.
    Gong, H.
    IRBM, 2022, 43 (06) : 628 - 639
  • [46] Activated Sludge Microscopic Image Segmentation Method Based on Improved U-Net
    Zhao Lijie
    Lu Xingkui
    Chen Bin
    LASER & OPTOELECTRONICS PROGRESS, 2021, 58 (12)
  • [47] Femur segmentation in X-ray image based on improved U-Net
    Fan Lianghui
    Han JunGang
    Jia Yang
    Yang Bin
    2019 THE 5TH INTERNATIONAL CONFERENCE ON ELECTRICAL ENGINEERING, CONTROL AND ROBOTICS (EECR 2019), 2019, 533
  • [48] Cervical Image Segmentation using U-Net Model
    Liu, Yao
    Bai, Bing
    Chen, Hua-Ching
    Liu, Peizhong
    Feng, Hsuan-Ming
    2019 INTERNATIONAL SYMPOSIUM ON INTELLIGENT SIGNAL PROCESSING AND COMMUNICATION SYSTEMS (ISPACS), 2019
  • [49] Breast Tumor Segmentation in Ultrasound Images Based on U-NET Model
    Michael, Epimack
    Ma, He
    Qi, Shouliang
    PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON INNOVATIONS IN COMPUTING RESEARCH (ICR'22), 2022, 1431 : 22 - 31
  • [50] Automated segmentation of brain tumor based on improved U-Net with residual units
    Huang, Chuanbo
    Wan, Minghua
    MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (09) : 12543 - 12566