A Novel ConvLSTM-Based U-net for Improved Brain Tumor Segmentation

Cited by: 1
Authors:
Almiahi, Osama Majeed Hilal [1]
Albu-Salih, Alaa Taima [1]
Alhajim, Dhafer [2]
Affiliations:
[1] Univ Al Qadisiyah, Coll Comp Sci & Informat Technol, Al Diwaniyah 58002, Iraq
[2] Univ Al Qadisiyah, Comp Ctr, Al Diwaniyah 58002, Iraq
Source:
IEEE ACCESS, 2024, Vol. 12
Keywords:
Brain tumor; deep learning; ConvLSTM; up skip connection; U-net; CLASSIFICATION; NETWORKS
DOI:
10.1109/ACCESS.2024.3483562
CLC Number:
TP [Automation Technology; Computer Technology]
Discipline Code:
0812
Abstract:
Previous deep-learning approaches to brain tumor segmentation rely either on 2D scans or on plain 3D convolutions, and therefore ignore the temporal distribution of the scans. This study proposes a novel extension of the well-known U-net model for brain tumor segmentation that takes 3D Magnetic Resonance Imaging (MRI) volumes as input. The method, called ConvLSTM-based U-net + up skip connections, incorporates ConvLSTM blocks to capture spatio-temporal dependencies in the 3D MRI volumes, and up skip connections that carry low-level feature maps from the encoding path, enhancing information flow through the network relative to the standard U-net architecture. A novel intensity normalization technique improves the comparability of scans by subtracting the grey value of the most frequent histogram bin from the image. The method is evaluated on the Multimodal Brain Tumor Segmentation (BRATS) 2015 dataset: the ConvLSTM blocks improved segmentation quality by 1.6% on the test subset; adding skip connections improved performance by a further 3.3% and 1.7% relative to the U-net and ConvLSTM-based U-net models, respectively; and the up skip connections enhanced performance by 5.7%, 3.99%, and 2.2% relative to the plain U-net, the ConvLSTM-based U-net, and the ConvLSTM-based U-net with skip connections, respectively. Finally, the novel preprocessing technique had a positive effect on the proposed network, improving the segmentation outcomes by 3.3%.
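For illustration, a minimal Python sketch of the described intensity normalization follows. The bin count and the restriction to non-zero (foreground) voxels are assumptions made here for completeness; the abstract states only that the grey value of the most frequent bin is subtracted.

import numpy as np

def mode_bin_normalize(volume: np.ndarray, n_bins: int = 256) -> np.ndarray:
    """Subtract the grey value of the most frequent histogram bin.

    Sketch of the preprocessing described in the abstract; n_bins and
    the foreground mask are assumptions, not taken from the paper.
    """
    # Histogram only the non-zero voxels: in skull-stripped MRI the zero
    # background would otherwise dominate the mode (an assumption here).
    foreground = volume[volume > 0]
    counts, edges = np.histogram(foreground, bins=n_bins)
    mode_idx = int(np.argmax(counts))
    # Take the centre of the most populated bin as the reference grey value.
    mode_value = 0.5 * (edges[mode_idx] + edges[mode_idx + 1])
    return volume - mode_value

Likewise, a toy Keras sketch of a U-net-style encoder/decoder built from ConvLSTM blocks, with one encoder-to-decoder concatenation. The depth, layer widths, and input shape are hypothetical, and the plain concatenation only stands in for the paper's up skip connection variant.

import tensorflow as tf
from tensorflow.keras import layers

def toy_convlstm_unet(t=4, h=240, w=240, c=4):
    """Two-level ConvLSTM U-net sketch (hypothetical sizes, not the
    authors' exact architecture)."""
    inp = layers.Input(shape=(t, h, w, c))  # a sequence of t slices
    # Encoder: ConvLSTM blocks model spatio-temporal dependencies.
    e1 = layers.ConvLSTM2D(16, 3, padding="same", return_sequences=True)(inp)
    p1 = layers.TimeDistributed(layers.MaxPooling2D(2))(e1)
    b = layers.ConvLSTM2D(32, 3, padding="same", return_sequences=True)(p1)
    # Decoder: upsample, then concatenate low-level encoder features
    # (a plain skip connection standing in for the up skip connection).
    u1 = layers.TimeDistributed(layers.UpSampling2D(2))(b)
    u1 = layers.Concatenate()([u1, e1])
    d1 = layers.ConvLSTM2D(16, 3, padding="same", return_sequences=True)(u1)
    out = layers.TimeDistributed(layers.Conv2D(1, 1, activation="sigmoid"))(d1)
    return tf.keras.Model(inp, out)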
Pages: 157346-157358 (13 pages)