On the performance of fusion based planet-scope and Sentinel-2 data for crop classification using inception inspired deep convolutional neural network

Cited by: 22
Authors
Minallah, Nasru [1 ,2 ]
Tariq, Mohsin [2 ]
Aziz, Najam [1 ,2 ]
Khan, Waleed [1 ,2 ]
Rehman, Atiq ur [3 ]
Belhaouari, Samir Brahim [3 ]
Affiliations
[1] Univ Engn & Technol Peshawar, Dept Comp Syst Engn, Peshawar, KP, Pakistan
[2] Univ Engn & Technol UET Peshawar, Natl Ctr Big Data & Cloud Comp NCBC, Peshawar, KP, Pakistan
[3] Hamad Bin Khalifa Univ, Coll Sci & Engn, ICT Div, Doha, Qatar
Keywords
LAND-COVER;
DOI
10.1371/journal.pone.0239746
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline classification codes
07 ; 0710 ; 09 ;
Abstract
This research work aims to develop a deep learning-based crop classification framework for remotely sensed time series data. Tobacco is a major revenue-generating crop of the Khyber Pakhtunkhwa (KP) province of Pakistan, which accounts for over 90% of the country's tobacco production. To analyze the performance of the developed classification framework, a pilot sub-region named Yar Hussain is selected for experimentation. Yar Hussain is a tehsil of district Swabi, within the KP province of Pakistan, with the highest contribution to the gross production of the KP tobacco crop. KP generally consists of diverse crop land with different varieties of vegetation having similar phenology, which makes crop classification a challenging task. In this study, a temporal convolutional neural network (TempCNN) model is implemented for crop classification, considering remotely sensed imagery of the selected pilot region with a specific focus on the tobacco crop. To improve the performance of the proposed classification framework, instead of following the prevailing practice of using imagery from a single satellite, both Sentinel-2 and PlanetScope imagery are stacked together to provide more diverse features to the proposed classification framework. Furthermore, instead of using single-date satellite imagery, multiple acquisitions covering the phenological cycle of the tobacco crop are temporally stacked, resulting in a higher temporal resolution of the employed satellite imagery. The developed framework is trained using the ground truth data. The final output is obtained from the SoftMax function of the developed model in the form of probabilistic values for the selected classes. The proposed deep learning-based crop classification framework, utilizing multi-satellite temporally stacked imagery, achieved an overall classification accuracy of 98.15%.
Furthermore, as the developed classification framework evolved with a specific focus on the tobacco crop, it achieved a best tobacco-crop classification accuracy of 99%.
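The data-preparation and output steps described in the abstract (band-stacking co-registered Sentinel-2 and PlanetScope acquisitions per date, stacking dates along a time axis, and obtaining class probabilities from a final SoftMax) can be sketched as follows. This is a minimal illustrative sketch using NumPy, not the authors' implementation; array names, band counts, and shapes are assumptions for illustration only.

```python
import numpy as np

def stack_multisensor_series(sentinel2_dates, planetscope_dates):
    """Fuse per-date Sentinel-2 and PlanetScope rasters of shape (H, W, bands)
    along the band axis, then stack all dates along a leading time axis.
    Assumes all rasters are co-registered on the same spatial grid."""
    per_date = [np.concatenate([s2, ps], axis=-1)
                for s2, ps in zip(sentinel2_dates, planetscope_dates)]
    return np.stack(per_date, axis=0)  # shape (T, H, W, C_s2 + C_ps)

def softmax(logits):
    """Convert the network's final-layer outputs into class probabilities
    (numerically stabilized by subtracting the per-row maximum)."""
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy example: 3 acquisition dates, a 4x4 pixel tile,
# 4 Sentinel-2 bands fused with 3 PlanetScope bands (hypothetical counts).
s2_series = [np.random.rand(4, 4, 4) for _ in range(3)]
ps_series = [np.random.rand(4, 4, 3) for _ in range(3)]
cube = stack_multisensor_series(s2_series, ps_series)
print(cube.shape)  # (3, 4, 4, 7)
```

A TempCNN would then apply 1-D convolutions along the leading time axis of each pixel's band series, with the `softmax` above producing the per-class probabilistic values mentioned in the abstract.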
Pages: 16