Comparison of an Optimised Multiresolution Segmentation Approach with Deep Neural Networks for Delineating Agricultural Fields from Sentinel-2 Images

Cited: 0
Authors
Gideon Okpoti Tetteh
Marcel Schwieder
Stefan Erasmi
Christopher Conrad
Alexander Gocht
Affiliations
[1] Thünen Institute of Farm Economics
[2] Geography Department, Humboldt University of Berlin
[3] Martin-Luther-University Halle-Wittenberg
[4] Institute of Geosciences and Geography
Source
PFG – Journal of Photogrammetry, Remote Sensing and Geoinformation Science | 2023, Vol. 91
Keywords
Agriculture; Image segmentation; Deep learning; Semantic segmentation; Instance segmentation
DOI
Not available
Abstract
Effective monitoring of agricultural lands requires accurate spatial information about the locations and boundaries of agricultural fields. Satellite imagery allows such information to be mapped at large scale and high temporal frequency. Various methods exist in the literature for segmenting agricultural fields from satellite images: edge-based, region-based, and hybrid segmentation methods have traditionally been widely used. Lately, deep neural networks (DNNs) have been gaining traction for various remote sensing tasks. To identify the optimal method for segmenting agricultural fields from satellite images, we therefore evaluated three state-of-the-art DNNs, namely Mask R-CNN, U-Net, and FracTAL ResUNet, against the multiresolution segmentation (MRS) algorithm, a traditional region-based segmentation method. Because the DNNs are supervised methods, we used an optimised version of the MRS algorithm based on supervised Bayesian optimisation. Monotemporal Sentinel-2 (S2) images acquired in Lower Saxony, Germany, were used in this study. Based on the agricultural parcels declared by farmers within the European Common Agricultural Policy (CAP) framework, the segmentation results of each method were evaluated using the F-score and intersection-over-union (IoU) metrics. The average F-score and IoU, respectively, were 0.682 and 0.524 for Mask R-CNN, 0.781 and 0.646 for U-Net, 0.808 and 0.683 for FracTAL ResUNet, and 0.805 and 0.678 for the optimised MRS approach. This study shows that DNNs, particularly FracTAL ResUNet, can be effectively used for large-scale segmentation of agricultural fields from satellite images.
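The abstract reports segmentation accuracy as F-score and intersection over union (IoU). As a minimal sketch of these two metrics (not the paper's full evaluation protocol, which pairs predicted segments with CAP reference parcels), the function names `iou` and `f_score` and the toy masks below are illustrative assumptions:

```python
import numpy as np

def iou(pred: np.ndarray, ref: np.ndarray) -> float:
    """Intersection over union (Jaccard index) of two binary masks."""
    inter = np.logical_and(pred, ref).sum()
    union = np.logical_or(pred, ref).sum()
    return inter / union if union else 0.0

def f_score(pred: np.ndarray, ref: np.ndarray) -> float:
    """F-score (Dice coefficient) of two binary masks:
    2*TP / (|pred| + |ref|)."""
    inter = np.logical_and(pred, ref).sum()
    total = pred.sum() + ref.sum()
    return 2 * inter / total if total else 0.0

# Toy 3x3 field masks: predicted segment vs. reference parcel
pred = np.array([[1, 1, 0],
                 [1, 0, 0],
                 [0, 0, 0]])
ref = np.array([[1, 1, 0],
                [0, 0, 0],
                [0, 0, 0]])
print(round(iou(pred, ref), 3))      # → 0.667
print(round(f_score(pred, ref), 3))  # → 0.8
```

Note that the F-score is always at least as high as the IoU for the same mask pair, which is consistent with the per-method averages reported above (e.g. 0.808 vs. 0.683 for FracTAL ResUNet).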
Pages: 295–312 (17 pages)