A Conditionally Parameterized Feature Fusion U-Net for Building Change Detection

Cited by: 1
Authors
Gu, Yao [1 ]
Ren, Chao [1 ,2 ]
Chen, Qinyi [1 ]
Bai, Haoming [1 ]
Huang, Zhenzhong [1 ]
Zou, Lei [1 ]
Affiliations
[1] Guilin Univ Technol, Coll Geomat & Geoinformat, Guilin 541006, Peoples R China
[2] Guangxi Key Lab Spatial Informat & Geomat, Guilin 541004, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
building change detection; small buildings; attention mechanism; feature fusion;
DOI
10.3390/su16219232
Chinese Library Classification (CLC)
X [Environmental Science, Safety Science];
Subject Classification Codes
08; 0830;
Abstract
The semantic richness of remote sensing images often poses challenges for building detection, such as blurred edges, loss of detail, and low resolution. To address these issues and improve boundary precision, this paper proposes CCCUnet, a hybrid architecture developed for enhanced building extraction. CCCUnet integrates CondConv, Coord Attention, and a CGAFusion module to overcome the limitations of traditional U-Net-based methods. Additionally, the NLLLoss function is used for the classification task to optimize model parameters during training. CondConv replaces the standard convolution operations in the U-Net encoder, boosting model capacity and performance in building change detection while keeping inference efficient. Coord Attention strengthens the detection of the complex contours of small buildings. Furthermore, the CGAFusion module combines channel and spatial attention in the skip connections, capturing both spatial and channel-wise correlations. Experimental results demonstrate that CCCUnet achieves high accuracy in building change detection, with sharper edges and better detection of small building contours. CCCUnet therefore serves as a valuable tool for precise building extraction from remote sensing images, with broad applications in urban planning, land use, and disaster monitoring.
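This record gives no implementation details, so the sketch below is illustrative only: it shows one common PyTorch way to realize a conditionally parameterized (CondConv-style) convolution, plus the pairing of NLLLoss with log_softmax that the abstract mentions. The class name CondConv2d, the number of experts, and the sigmoid routing function are assumptions for illustration, not the authors' released code.

import torch
import torch.nn as nn
import torch.nn.functional as F


class CondConv2d(nn.Module):
    """Minimal CondConv-style layer (assumed design, not the paper's code):
    a routing function produces per-example weights that mix several expert
    kernels, and the mixed kernel is applied via one grouped convolution."""

    def __init__(self, in_ch, out_ch, kernel_size=3, num_experts=4, padding=1):
        super().__init__()
        self.in_ch, self.out_ch = in_ch, out_ch
        self.kernel_size, self.padding = kernel_size, padding
        # One convolution kernel per expert, stored in a single parameter tensor.
        self.experts = nn.Parameter(
            0.02 * torch.randn(num_experts, out_ch, in_ch, kernel_size, kernel_size)
        )
        # Routing: global average pooling -> linear -> sigmoid, one weight per expert.
        self.route = nn.Linear(in_ch, num_experts)

    def forward(self, x):
        b, c, h, w = x.shape
        r = torch.sigmoid(self.route(x.mean(dim=(2, 3))))      # (b, num_experts)
        # Per-example mixture of expert kernels.
        k = torch.einsum("be,eoikl->boikl", r, self.experts)   # (b, out, in, kh, kw)
        k = k.reshape(b * self.out_ch, self.in_ch, self.kernel_size, self.kernel_size)
        # Grouped conv applies each example's mixed kernel to that example only.
        y = F.conv2d(x.reshape(1, b * c, h, w), k, padding=self.padding, groups=b)
        return y.reshape(b, self.out_ch, h, w)


# NLLLoss expects log-probabilities, so the classification head ends with log_softmax.
logits = torch.randn(2, 2, 64, 64)           # dummy head output: (batch, classes, H, W)
target = torch.randint(0, 2, (2, 64, 64))    # per-pixel change / no-change labels
loss = nn.NLLLoss()(F.log_softmax(logits, dim=1), target)

In a U-Net encoder, layers of this kind would simply take the place of the standard nn.Conv2d blocks, which is the substitution the abstract describes.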
Pages: 20