UAV and Machine Learning Based Refinement of a Satellite-Driven Vegetation Index for Precision Agriculture

Cited: 92
Authors
Mazzia, Vittorio [1 ,2 ]
Comba, Lorenzo [3 ,4 ]
Khaliq, Aleem [1 ,2 ]
Chiaberge, Marcello [1 ,2 ]
Gay, Paolo [3 ]
Affiliations
[1] Politecn Torino, Dept Elect & Telecommun, Corso Duca Abruzzi 24, I-10129 Turin, Italy
[2] Politecn Interdept Ctr Serv Robot, PIC4SeR, I-10129 Turin, Italy
[3] Univ Torino, Dept Agr Forest & Food Sci, Largo Paolo Braccini 2, I-10095 Grugliasco, TO, Italy
[4] Politecn Torino, Natl Res Council Italy, Inst Elect Comp & Telecommun Engn, Corso Duca Abruzzi 24, I-10129 Turin, Italy
Keywords
precision agriculture; remote sensing; moderate resolution satellite imagery; UAV; convolutional neural network;
DOI
10.3390/s20092530
CLC number
O65 [Analytical Chemistry];
Subject classification codes
070302; 081704
Abstract
Precision agriculture is considered a fundamental approach to pursuing low-input, high-efficiency, and sustainable agriculture through site-specific management practices. Achieving this objective requires a reliable and up-to-date description of local crop status. Remote sensing, and in particular satellite-based imagery, has proved to be a valuable tool for crop mapping, monitoring, and disease assessment. However, freely available satellite imagery with low or moderate resolution shows limits in specific agricultural applications, e.g., where crops are grown in rows. In this setting, the satellite output can be biased by intra-row covering, giving inaccurate information about crop status. This paper presents a novel satellite imagery refinement framework based on a deep learning technique that exploits information derived from high-resolution images acquired by unmanned aerial vehicle (UAV) airborne multispectral sensors. Training the convolutional neural network requires only a single UAV-driven dataset, making the proposed approach simple and cost-effective. A vineyard in Serralunga d'Alba (Northern Italy) was chosen as a case study for validation. Correlation analysis and ANOVA showed that refined satellite-driven normalized difference vegetation index (NDVI) maps, acquired in four different periods during the vine growing season, describe crop status better than the raw datasets. In addition, a K-means based classifier derived 3-class vineyard vigor maps from the NDVI maps, providing a valuable tool for growers.
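The two post-processing steps named in the abstract, NDVI computation and a 3-class K-means vigor classification, can be sketched as follows. This is an illustrative NumPy implementation, not the authors' code: the quantile-based center initialization is an assumption made here so the toy example is deterministic, and the standard NDVI formula (NIR − Red) / (NIR + Red) is used.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)  # eps guards against division by zero

def vigor_classes(ndvi_map, k=3, iters=50):
    """Cluster NDVI values into k vigor classes with 1-D k-means (Lloyd's algorithm).

    Centers are initialised at evenly spaced quantiles of the NDVI distribution,
    so the result is deterministic. Returned labels are ordered by center value:
    0 = low vigor, ..., k-1 = high vigor.
    """
    x = np.asarray(ndvi_map, dtype=float).ravel()
    centers = np.quantile(x, np.linspace(0.0, 1.0, k))
    for _ in range(iters):
        # assign each pixel to the nearest center, then recompute centers
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    order = np.argsort(centers)          # relabel so class index grows with vigor
    remap = np.empty(k, dtype=int)
    remap[order] = np.arange(k)
    return remap[labels].reshape(np.asarray(ndvi_map).shape)
```

For per-pixel NIR/Red reflectances, `vigor_classes(ndvi(nir, red))` yields a vigor map with the same shape as the input, ready for zoning or comparison against a refined satellite NDVI product.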
Pages: 16