Towards a Multi-Temporal Deep Learning Approach for Mapping Urban Fabric Using Sentinel 2 Images

Cited by: 32
Authors
El Mendili, Lamiae [1 ]
Puissant, Anne [2 ]
Chougrad, Mehdi [1 ]
Sebari, Imane [1 ]
Affiliations
[1] IAV Hassan II, School of Geomatics and Surveying Engineering, Rabat 10101, Morocco
[2] University of Strasbourg, Department of Geography, CNRS, LIVE UMR 7362, F-67000 Strasbourg, France
Keywords
urban fabric; Sentinel-2; deep learning; multi-temporal approach; land-cover classification; time series
DOI
10.3390/rs12030423
Chinese Library Classification: X [Environmental Science, Safety Science]
Discipline Classification Code: 08; 0830
Abstract
The majority of the world's population lives in urban areas, and this share is expected to increase in the future. The main challenges cities face now and in the coming decades are rapid urbanization, rising urban temperatures, and the urban heat island effect. Mapping and monitoring urban fabric (UF) to analyze the environmental impact of these phenomena is more necessary than ever. This, coupled with the increased availability of Earth observation data and their growing temporal capabilities, leads us to consider using temporal features to improve land use classification, especially in urban environments where spectral overlap between classes makes it challenging. Urban land use classification thus remains a central question in remote sensing. Although some studies have successfully used multi-temporal images such as Landsat-8 or Sentinel-2 to improve land cover classification, urban land use mapping is rarely carried out using the temporal dimension. This paper explores the use of Sentinel-2 data in a deep learning framework, first by assessing the temporal robustness of four popular fully convolutional neural networks (FCNs) trained on single-date images for classification of the urban footprint, and second by proposing a multi-temporal FCN. A performance comparison between the proposed framework and a regular FCN is also conducted. In this study, we consider four UF classes typical of many Western European cities. Results show that training the proposed multi-date model on Sentinel-2 multi-temporal data achieved the best results, with a Kappa coefficient increase of 2.72% and 6.40% for continuous UF and industrial facilities, respectively. Although a more definitive conclusion requires further testing, these first results are promising because they confirm that integrating the temporal dimension at high spatial resolution into urban land use classification may be a valuable strategy for discriminating among several urban categories.
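To illustrate the idea of a multi-temporal FCN that fuses several Sentinel-2 acquisition dates into one per-pixel classification, the PyTorch sketch below stacks the dates along the channel axis of a small encoder-decoder network. This is only a minimal illustrative example, not the authors' architecture; the band count (10), number of dates (3), patch size, layer widths, and the four UF classes are assumptions made for the example.

    # Minimal sketch of early temporal fusion in a fully convolutional network.
    # Not the authors' model; dimensions and class count are assumptions.
    import torch
    import torch.nn as nn

    class MultiTemporalFCN(nn.Module):
        def __init__(self, n_dates=3, n_bands=10, n_classes=4):
            super().__init__()
            in_ch = n_dates * n_bands  # all dates concatenated as input channels
            self.encoder = nn.Sequential(
                nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.MaxPool2d(2),
                nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(inplace=True),
                nn.MaxPool2d(2),
            )
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(128, 64, 2, stride=2), nn.ReLU(inplace=True),
                nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(inplace=True),
                nn.Conv2d(32, n_classes, 1),  # per-pixel class scores
            )

        def forward(self, x):
            # x: (batch, n_dates * n_bands, H, W) -- dates stacked as channels
            return self.decoder(self.encoder(x))

    # Usage: a batch of two 64x64 patches from 3 acquisition dates, 10 bands each.
    model = MultiTemporalFCN()
    patches = torch.randn(2, 30, 64, 64)
    logits = model(patches)  # shape (2, 4, 64, 64): class scores per pixel

Early fusion (channel stacking) is only one way to inject the temporal dimension; per-date encoders with late fusion or recurrent temporal modules are common alternatives in the multi-temporal classification literature.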
Pages: 21