A2-FPN for semantic segmentation of fine-resolution remotely sensed images

Cited by: 83
Authors
Li, Rui [1 ]
Wang, Libo [1 ]
Zhang, Ce [2 ,3 ]
Duan, Chenxi [4 ]
Zheng, Shunyi [1 ]
Affiliations
[1] Wuhan Univ, Sch Remote Sensing & Informat Engn, Wuhan, Peoples R China
[2] Univ Lancaster, Lancaster Environm Ctr, Lancaster, England
[3] UK Ctr Ecol & Hydrol, Lancaster, England
[4] Univ Twente, Fac Geoinformat Sci & Earth Observat ITC, Hengelosestr 99, NL-7514 AE Enschede, Netherlands
Funding
National Natural Science Foundation of China;
Keywords
semantic segmentation; deep learning; attention mechanism; LAND-COVER; NETWORK;
DOI
10.1080/01431161.2022.2030071
CLC number
TP7 [Remote Sensing Technology];
Discipline codes
081102 ; 0816 ; 081602 ; 083002 ; 1404 ;
Abstract
The thriving development of earth observation technology makes high-resolution remote-sensing images increasingly easy to obtain. However, the huge spatial and spectral complexity inherent in fine-resolution imagery makes automated semantic segmentation a challenging task. Addressing this issue represents an exciting research field that paves the way for scene-level landscape pattern analysis and decision-making. To tackle this problem, we propose an approach for automatic land segmentation based on the Feature Pyramid Network (FPN). As a classic architecture, FPN can build a feature pyramid with high-level semantics throughout. However, intrinsic defects in feature extraction and fusion hinder FPN from aggregating more discriminative features. Hence, we propose an Attention Aggregation Module (AAM) to enhance multiscale feature learning through attention-guided feature aggregation. Based on FPN and AAM, a novel framework named Attention Aggregation Feature Pyramid Network (A(2)-FPN) is developed for semantic segmentation of fine-resolution remotely sensed images. Extensive experiments conducted on four datasets demonstrate the effectiveness of our A(2)-FPN in segmentation accuracy. Code is available at .
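The core idea described above, fusing FPN levels under attention guidance rather than by plain summation, can be sketched in a few lines. This is an illustrative NumPy toy, not the authors' implementation: the nearest-neighbour upsampling and the scalar softmax weighting over pooled descriptors are hypothetical stand-ins for the paper's Attention Aggregation Module.

```python
import numpy as np

def attention_aggregate(features):
    """Attention-guided aggregation of FPN levels (illustrative sketch).

    `features` is a list of arrays shaped (C, H_i, W_i), ordered from the
    finest level to the coarsest, with spatial sizes that divide evenly
    into the finest level's size. Every level is upsampled to the finest
    resolution, then the levels are fused with softmax attention weights
    derived from their global-average-pooled descriptors.
    """
    c, h, w = features[0].shape
    upsampled = []
    for f in features:
        _, fh, fw = f.shape
        assert h % fh == 0 and w % fw == 0, "levels must divide evenly"
        # nearest-neighbour upsampling to the finest level's spatial size
        upsampled.append(f.repeat(h // fh, axis=1).repeat(w // fw, axis=2))
    # one scalar attention score per level from global average pooling
    scores = np.array([f.mean() for f in upsampled])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()  # softmax over pyramid levels
    # attention-weighted sum replaces FPN's uniform element-wise addition
    return sum(wgt * f for wgt, f in zip(weights, upsampled))
```

In the actual A(2)-FPN the attention is learned and spatially varying, whereas this sketch only shows the structural point: discriminative levels receive larger fusion weights instead of contributing equally.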
Pages: 1131 - 1155
Page count: 25
Related papers
50 items total
  • [21] Significance of texture features in the segmentation of remotely sensed images
    Usha, S. Gandhimathi Alias
    Vasuki, S.
    OPTIK, 2022, 249
  • [22] Level set segmentation of remotely sensed hyperspectral images
    Ball, JE
    Bruce, LM
    IGARSS 2005: IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, VOLS 1-8, PROCEEDINGS, 2005, : 5638 - 5642
  • [24] Geostatistical simulation of fine-resolution NDVI images
    Frykman, P
    Sandholt, I
    Norgaard, A
    GIS and Spatial Analysis, Vols 1 and 2, 2005, : 133 - 138
  • [25] Bispace Domain Adaptation Network for Remotely Sensed Semantic Segmentation
    Liu, Wei
    Su, Fulin
    Jin, Xinfei
    Li, Hongxu
    Qin, Rongjun
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
  • [27] A2-FPN: Attention Aggregation based Feature Pyramid Network for Instance Segmentation
    Hu, Miao
    Li, Yali
    Fang, Lu
    Wang, Shengjin
    Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2021, : 15338 - 15347
  • [28] BUILDING CHANGE DETECTION FOR HIGH-RESOLUTION REMOTELY SENSED IMAGES BASED ON A SEMANTIC DEPENDENCY
    Zhong, Chen
    Xu, Qizhi
    Yang, Feng
    Hu, Lei
    2015 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS), 2015, : 3345 - 3348
  • [29] SCALE SELECTION BASED ON MORAN'S I FOR SEGMENTATION OF HIGH RESOLUTION REMOTELY SENSED IMAGES
    Meng, Yan
    Lin, Chao
    Cui, Weihong
    Yao, Jian
    2014 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS), 2014, : 4895 - 4898
  • [30] Semantic edge-guided object segmentation from high-resolution remotely sensed imagery
    Xia, Liegang
    Luo, Jiancheng
    Zhang, Junxia
    Zhu, Zhiwen
    Gao, Lijing
    Yang, Haiping
    INTERNATIONAL JOURNAL OF REMOTE SENSING, 2021, 42 (24) : 9434 - 9458