Action Detection for Wildlife Monitoring with Camera Traps Based on Segmentation with Filtering of Tracklets (SWIFT) and Mask-Guided Action Recognition (MAROON)

Cited by: 6
Authors
Schindler, Frank [1 ]
Steinhage, Volker [1 ]
van Beeck Calkoen, Suzanne T. S. [2 ,3 ,4 ]
Heurich, Marco [2 ,5 ,6 ]
Affiliations
[1] Univ Bonn, Dept Comp Sci 4, Friedrich Hirzebruch Allee 8, D-53115 Bonn, Germany
[2] Bavarian Forest Natl Pk, Dept Natl Pk Monitoring & Anim Management, Freyunger Str 2, D-94481 Grafenau, Germany
[3] Tech Univ Dresden, Inst Forest Bot & Forest Zool, Forest Zool, Pienner Str 7, D-01737 Tharandt, Germany
[4] Univ Goettingen, Fac Forest Sci & Forest Ecol, Wildlife Sci, Buesgenweg 3, D-37077 Gottingen, Germany
[5] Univ Freiburg, Fac Environm & Nat Resources, Tennenbacher Str 4, D-79106 Freiburg, Germany
[6] Inland Norway Univ Appl Sci, Inst Forestry & Wildlife Management, NO-2480 Koppang, Norway
Source
APPLIED SCIENCES-BASEL | 2024, Vol. 14, Issue 2
Keywords
wildlife monitoring; deep learning; video instance segmentation; mask-supported action recognition; triple-stream convolutional neural network; action detection for deer; BEHAVIOR;
DOI
10.3390/app14020514
Chinese Library Classification (CLC)
O6 [Chemistry];
Discipline Classification Code
0703;
Abstract
Behavioral analysis of animals in the wild plays an important role for ecological research and conservation and has mostly been performed manually by researchers. We introduce an action detection approach that automates this process by detecting animals and performing action recognition on the detected animals in camera trap videos. Our action detection approach is based on SWIFT (segmentation with filtering of tracklets), which we have already shown to successfully detect and track animals in wildlife videos, and MAROON (mask-guided action recognition), an action recognition network that we introduce here. The basic ideas of MAROON are the exploitation of the instance masks detected by SWIFT and a triple-stream network. The instance masks enable more accurate action recognition, especially if multiple animals appear in a video at the same time. The triple-stream approach extracts features for both the motion and the appearance of the animal. We evaluate the quality of our action recognition on two self-generated datasets, one from an animal enclosure and one from the wild. These datasets contain videos of red deer, fallow deer and roe deer, recorded both during the day and at night. MAROON improves the action recognition accuracy compared to other state-of-the-art approaches by an average of 10 percentage points across all analyzed datasets and achieves an accuracy of 69.16% on the Rolandseck Daylight dataset, in which 11 different action classes occur. Our action detection system makes it possible to drastically reduce the manual work of ecologists and at the same time gain new insights through standardized results.
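Illustrative note: the abstract above gives no implementation details, so the following is only a minimal, hypothetical PyTorch sketch of the general idea of a mask-guided, multi-stream clip classifier. The class name TripleStreamActionNet, the choice of streams (masked RGB, full-frame RGB, frame differences as a motion proxy) and all layer sizes are assumptions for illustration; they do not reproduce the MAROON architecture described in the paper.

import torch
import torch.nn as nn

class TripleStreamActionNet(nn.Module):
    """Hypothetical sketch: three small 3D-CNN streams whose features are concatenated."""
    def __init__(self, num_classes: int = 11):
        super().__init__()
        # One stream each for: masked RGB (appearance of the detected animal),
        # full-frame RGB (scene context), and frame differences (motion proxy).
        self.streams = nn.ModuleList([self._make_stream() for _ in range(3)])
        self.classifier = nn.Linear(3 * 32, num_classes)

    @staticmethod
    def _make_stream() -> nn.Sequential:
        return nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # global spatio-temporal pooling
            nn.Flatten(),
            nn.Linear(16, 32),
            nn.ReLU(),
        )

    def forward(self, frames, masks):
        # frames: (B, 3, T, H, W) video clip; masks: (B, 1, T, H, W) instance masks in [0, 1],
        # e.g. produced by an instance segmentation and tracking stage such as SWIFT.
        masked = frames * masks                                   # appearance of the target animal only
        motion = torch.zeros_like(frames)
        motion[:, :, 1:] = frames[:, :, 1:] - frames[:, :, :-1]   # simple frame differencing
        feats = [stream(x) for stream, x in zip(self.streams, (masked, frames, motion))]
        return self.classifier(torch.cat(feats, dim=1))

# Example: one 8-frame clip at 112x112 pixels with a dummy instance mask.
model = TripleStreamActionNet(num_classes=11)
clip = torch.rand(1, 3, 8, 112, 112)
mask = (torch.rand(1, 1, 8, 112, 112) > 0.5).float()
logits = model(clip, mask)  # shape: (1, 11), one score per action class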
Pages: 17
References (64 in total)
[41] Ryoo MSS, 2022, arXiv:2106.11297, DOI 10.48550/arXiv.2106.11297
[42] Sakib F, 2020, arXiv:2011.10759, DOI 10.48550/arXiv.2011.10759
[43] Schindler F, Steinhage V. Instance segmentation and tracking of animals in wildlife videos: SWIFT-segmentation with filtering of tracklets. Ecological Informatics, 2022, 71.
[44] Schindler F, Steinhage V. Identification of animals and recognition of their actions in wildlife videos using deep learning techniques. Ecological Informatics, 2021, 61.
[45] Sushmit AS, 2020, arXiv:2008.08452
[46] Sheth I, 2021, arXiv:2104.13051
[47] Simonyan K, 2014, Advances in Neural Information Processing Systems, Vol. 27
[48] Sofiiuk K, Petrov IA, Konushin A. Reviving iterative training with mask guidance for interactive segmentation. 2022 IEEE International Conference on Image Processing (ICIP), 2022: 3141-3145.
[49] Tang J, Xia J, Mu X, Pang B, Lu C. Asynchronous interaction aggregation for action detection. Computer Vision - ECCV 2020, Part XV, 2020, 12360: 71-87.
[50] Tobler MW, Carrillo-Percastegui SE, Pitman RL, Mares R, Powell G. Further notes on the analysis of mammal inventory data collected with camera traps. Animal Conservation, 2008, 11(3): 187-189.