Scene-specific convolutional neural networks for video-based biodiversity detection

Cited by: 38
Author
Weinstein, Ben G. [1 ]
Affiliation
[1] Oregon State Univ, Marine Mammal Inst, Dept Fisheries & Wildlife, Newport, OR 97365 USA
Source
METHODS IN ECOLOGY AND EVOLUTION | 2018, Vol. 9, Issue 6
Keywords
automated monitoring; computer vision; hummingbirds; neural networks; remote cameras;
DOI
10.1111/2041-210X.13011
Chinese Library Classification
Q14 [Ecology (Bioecology)]
Discipline codes
071012; 0713
Abstract
1. Finding, counting and identifying animals is a central challenge in ecology. Most studies are limited by the time and cost of fieldwork by human observers. To increase the spatial and temporal breadth of sampling, ecologists are adopting passive image-based monitoring approaches. While passive monitoring can expand data collection, a remaining obstacle is finding the small proportion of images containing ecological objects among the majority of frames containing only background scenes.
2. I propose a scene-specific convolutional neural network for detecting animals of interest within long-duration time-lapse videos. Convolutional neural networks are a type of deep learning algorithm that has recently made significant advances in image classification.
3. The approach was tested on videos of floral visitation by hummingbirds. Despite low frame rates, poor image quality and complex video conditions, the model correctly classified over 90% of frames containing hummingbirds. Combining motion detection and image classification can substantially reduce the time invested in scoring images from passive monitoring studies.
4. These results underscore the promise of deep learning to lead ecology into greater automation using passive image analysis. To help facilitate future applications, I created a desktop executable that applies pre-trained models to videos, as well as reproducible scripts for training new models in local and cloud environments.
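The two-stage pipeline the abstract describes — a cheap motion filter to discard empty background frames, followed by a classifier applied only to candidate frames — can be sketched roughly as below. This is an illustrative assumption of the structure, not the paper's implementation: the threshold value and the stand-in brightness "classifier" are placeholders for the trained scene-specific network.

```python
# Hedged sketch of a motion-detection + classification pipeline, assuming
# simple frame differencing for stage 1. In the paper, stage 2 is a trained
# scene-specific CNN; here a brightness heuristic stands in for it.
import numpy as np

def motion_candidates(frames, threshold=10.0):
    """Return indices of frames whose mean absolute difference from the
    previous frame exceeds `threshold` (frame differencing)."""
    candidates = []
    for i in range(1, len(frames)):
        diff = np.abs(frames[i].astype(float) - frames[i - 1].astype(float))
        if diff.mean() > threshold:
            candidates.append(i)
    return candidates

def classify(frame):
    """Stand-in for the trained CNN's per-frame prediction
    (hypothetical: a real pipeline would call the network here)."""
    return frame.mean() > 128

def detect(frames, threshold=10.0):
    """Full pipeline: motion filter first, then classify only candidates."""
    return [i for i in motion_candidates(frames, threshold) if classify(frames[i])]

# Tiny synthetic clip: a static dark scene with one bright "visit" frame.
frames = [np.zeros((8, 8), dtype=np.uint8) for _ in range(5)]
frames[3] = np.full((8, 8), 200, dtype=np.uint8)
print(detect(frames))  # → [3]
```

The design point the abstract makes is that stage 1 is orders of magnitude cheaper than the CNN, so running the classifier only on motion-flagged frames is what makes scoring long-duration videos tractable.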
Pages: 1435-1441
Page count: 7