Hydra: An Ensemble of Convolutional Neural Networks for Geospatial Land Classification

Cited by: 117
Authors
Minetto, Rodrigo [1 ,2 ]
Segundo, Mauricio Pamplona [1 ,3 ]
Sarkar, Sudeep [4 ]
Affiliations
[1] USF, Comp Vis & Pattern Recognit Grp, Tampa, FL 33620 USA
[2] Univ Tecnol Fed Parana UTFPR, Dept Informat, BR-80230901 Curitiba, Parana, Brazil
[3] Univ Fed Bahia UFBA, Dept Comp Sci, BR-40110110 Salvador, BA, Brazil
[4] USF, Dept Comp Sci & Engn, Tampa, FL 33620 USA
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2019, Vol. 57, No. 09
Keywords
Convolutional neural network (CNN); ensemble learning; functional map of world (FMOW); geospatial land classification; online data augmentation; remote sensing image classification;
DOI
10.1109/TGRS.2019.2906883
CLC classification
P3 [Geophysics]; P59 [Geochemistry]
Subject classification
0708; 070902
Abstract
In this paper, we describe Hydra, an ensemble of convolutional neural networks (CNNs) for geospatial land classification. The idea behind Hydra is to create an initial CNN that is coarsely optimized but provides a good starting point for further optimization; this network serves as the Hydra's body. The obtained weights are then fine-tuned multiple times with different augmentation techniques, crop styles, and class weights to form an ensemble of CNNs that represent the Hydra's heads. By doing so, we prompt convergence to different endpoints, which is a desirable property for ensembles. With this framework, we were able to reduce the training time while maintaining the classification performance of the ensemble. We created ensembles for our experiments using two state-of-the-art CNN architectures: residual networks (ResNet) and densely connected convolutional networks (DenseNet). We demonstrated the application of our Hydra framework on two data sets, functional map of world (FMOW) and NWPU-RESISC45, achieving results comparable to the state of the art on the former and the best reported performance so far on the latter. Code and CNN models are available at https://github.com/maups/hydra-fmow.
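The body-and-heads scheme in the abstract ends with the heads' predictions being combined into a single ensemble output. A minimal sketch of that final combination step is below; it assumes probability averaging over per-head softmax outputs, and the two-head logits are illustrative values, not the authors' released code or models:

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def hydra_predict(head_logits):
    """Ensemble prediction: average the class probabilities produced by
    each fine-tuned head, then take the argmax per sample."""
    probs = np.stack([softmax(l) for l in head_logits])  # (heads, samples, classes)
    return probs.mean(axis=0).argmax(axis=-1)

# Illustrative logits from two hypothetical heads: three samples, four classes.
head_a = np.array([[2.0, 0.1, 0.1, 0.1],
                   [0.1, 1.5, 0.2, 0.1],
                   [0.3, 0.2, 0.1, 1.0]])
head_b = np.array([[1.8, 0.2, 0.1, 0.2],
                   [0.2, 0.1, 2.0, 0.1],
                   [0.1, 0.1, 0.2, 1.4]])
print(hydra_predict([head_a, head_b]))  # one class index per sample
```

Because the heads start from the same body weights but diverge under different augmentations and class weights, their errors are partly decorrelated, which is what makes the averaged probabilities more robust than any single head.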
Pages: 6530-6541
Page count: 12