SAR U-Net: Spatial attention residual U-Net structure for water body segmentation from remote sensing satellite images

Cited by: 2
Authors:
Jonnala, Naga Surekha [1]
Gupta, Neha [1]
Affiliations:
[1] VIT AP Univ, Sch Elect Engn, Amaravati 522237, AP, India
Keywords:
Water Body Segmentation; Residual Block; U-Net; Spatial Attention Module; Satellite Images; Extraction
DOI: 10.1007/s11042-023-16965-8
Chinese Library Classification (CLC): TP [Automation Technology, Computer Technology]
Discipline code: 0812
Abstract
The analysis of remote-sensing images frequently requires extracting information about the aquatic environment. However, identifying water surfaces can be challenging because the backgrounds of water zones in remote-sensing images usually contain complex structures and dense vegetation. Furthermore, minor tributaries and edge details cannot be accurately detected by traditional water-detection methods. As a result, a spatial attention residual U-Net architecture is proposed to improve the effectiveness of water body segmentation. The proposed approach uses U-Net as the network backbone and spatially reweights the feature representation to capture water features. Residual blocks extract features of the water zone and obtain more precise local position information, which enhances edge segmentation accuracy. The spatial attention module retrieves, separates, and combines low-level and high-level information as two discrete inputs across different dimensions; by fusing spatial features with deep contextual information, it effectively separates the water region from the background. Experiments are performed on satellite images from the Kaggle water bodies dataset and a real-time dataset. The results show that the proposed approach achieves 96% accuracy and outperforms existing models.
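The spatial reweighting described in the abstract can be illustrated with a minimal NumPy sketch of a generic spatial attention gate: the channel axis is squeezed by average- and max-pooling, the two maps are fused, and a sigmoid produces per-pixel weights that rescale the feature map. This is an assumption-laden illustration, not the authors' implementation; in particular, the learned fusion convolution is replaced here by a fixed uniform kernel, and the function name `spatial_attention` is hypothetical.

```python
import numpy as np

def spatial_attention(features, kernel_size=7):
    """Spatially reweight a feature map (generic spatial-attention sketch).

    features: array of shape (C, H, W).
    Returns a reweighted feature map of the same shape.
    """
    # Squeeze the channel axis two ways: average- and max-pooling.
    avg_map = features.mean(axis=0)            # (H, W)
    max_map = features.max(axis=0)             # (H, W)
    pooled = np.stack([avg_map, max_map])      # (2, H, W)

    # A learned k x k convolution would normally fuse the two maps;
    # a fixed uniform kernel stands in for the learned weights here.
    pad = kernel_size // 2
    padded = np.pad(pooled, ((0, 0), (pad, pad), (pad, pad)), mode="edge")
    fused = np.zeros_like(avg_map)
    for i in range(fused.shape[0]):
        for j in range(fused.shape[1]):
            fused[i, j] = padded[:, i:i + kernel_size, j:j + kernel_size].mean()

    # A sigmoid turns the fused map into per-pixel weights in (0, 1).
    attention = 1.0 / (1.0 + np.exp(-fused))   # (H, W)
    return features * attention                # broadcast over channels
```

Because the weights lie in (0, 1), pixels the attention map deems background are suppressed while likely water pixels are preserved, which is the mechanism the abstract credits for separating the water region from its context.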
Pages: 44425 - 44454
Page count: 30
Related papers (50 in total)
  • [1] SAR U-Net: Spatial attention residual U-Net structure for water body segmentation from remote sensing satellite images
    Naga Surekha Jonnala
    Neha Gupta
    Multimedia Tools and Applications, 2024, 83 : 44425 - 44454
  • [2] Convolutional block attention module U-Net: a method to improve attention mechanism and U-Net for remote sensing images
    Zhang, Yanjun
    Kong, Jiayuan
    Long, Sifang
    Zhu, Yuanhao
    He, Fushuai
    JOURNAL OF APPLIED REMOTE SENSING, 2022, 16 (02)
  • [3] Segmentation of Mammogram Images Using U-Net with Fusion of Channel and Spatial Attention Modules (U-Net CASAM)
    Robert Singh, A.
    Vidya, S.
    Hariharasitaraman, S.
    Athisayamani, Suganya
    Hsu, Fang Rong
    Lecture Notes in Networks and Systems, 2024, 966 LNNS : 435 - 448
  • [4] Aircraft segmentation in remote sensing images based on multi-scale residual U-Net with attention
    Wang, Xuqi
    Zhang, Shanwen
    Huang, Lei
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (06) : 17855 - 17872
  • [6] Chaining a U-Net With a Residual U-Net for Retinal Blood Vessels Segmentation
    Alfonso Francia, Gendry
    Pedraza, Carlos
    Aceves, Marco
    Tovar-Arriaga, Saul
    IEEE ACCESS, 2020, 8 : 38493 - 38500
  • [7] An improved U-Net method for the semantic segmentation of remote sensing images
    Su, Zhongbin
    Li, Wei
    Ma, Zheng
    Gao, Rui
    APPLIED INTELLIGENCE, 2022, 52 (03) : 3276 - 3288
  • [9] Breast tumor segmentation in ultrasound images: comparing U-net and U-net++
    de Oliveira, Carlos Eduardo Gonçalves
    Vieira, Sílvio Leão
    Paranaiba, Caio Felipe Brito
    Itikawa, Emerson Nobuyuki
    Research on Biomedical Engineering, 2025, 41 (01)
  • [10] HARNU-Net: Hierarchical Attention Residual Nested U-Net for Change Detection in Remote Sensing Images
    Li, Haojin
    Wang, Liejun
    Cheng, Shuli
    SENSORS, 2022, 22 (12)