DEEP LEARNING-BASED TRACKING OF MULTIPLE OBJECTS IN THE CONTEXT OF FARM ANIMAL ETHOLOGY

Cited by: 1
Authors
Ali, R. [1 ]
Dorozynski, M. [1 ]
Stracke, J. [2 ]
Mehltretter, M. [1 ]
Affiliations
[1] Leibniz Univ Hannover, Inst Photogrammetry & GeoInformat, Hannover, Germany
[2] Univ Bonn, Inst Anim Sci, Bonn, Germany
Keywords
Image Sequence Analysis; Multi-Object Tracking; Tracktor; Animal Science; Poultry Tracking; MULTITARGET;
DOI
10.5194/isprs-archives-XLIII-B2-2022-509-2022
CLC Number
P9 [Physical Geography];
Discipline Codes
0705; 070501;
Abstract
Automatic detection and tracking of individual animals is important for enhancing their welfare and for improving our understanding of their behaviour. Due to methodological difficulties, especially in the context of poultry tracking, automatically recognising and tracking individual animals is a challenging task. These difficulties include, for example, the visual similarity of animals of the same species, which makes distinguishing between them harder, and sudden changes in body shape, which may occur when an animal flaps or spreads its wings within a very short period of time. In this paper, an automatic poultry tracking algorithm is proposed. This algorithm is based on the well-known Tracktor approach and tackles multi-object tracking by exploiting the regression head of the Faster R-CNN model to perform temporal realignment of object bounding boxes. Additionally, we use a multi-scale re-identification model to improve the re-association of the detected animals. To evaluate the performance of the proposed method, a novel dataset consisting of seven image sequences showing chicks in an average pen farm at different stages of growth is used.
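The Tracktor-style loop the abstract describes, carrying each track's box into the next frame via the detector's regression head and reviving lost tracks through appearance-based re-identification, can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: `regress_fn` stands in for the Faster R-CNN regression head and `embed_fn` for the multi-scale re-ID network, and both names, thresholds, and the greedy matching are hypothetical simplifications.

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

class TracktorStyleTracker:
    """Sketch of a Tracktor-style loop: active tracks are realigned on the
    new frame by the detector's regression head; tracks whose score drops
    are parked as 'lost' and may be revived via appearance similarity."""

    def __init__(self, regress_fn, embed_fn, score_thresh=0.5, sim_thresh=0.7):
        self.regress = regress_fn    # (frame, boxes) -> (boxes, scores), stand-in for Faster R-CNN head
        self.embed = embed_fn        # (frame, box) -> unit appearance vector, stand-in for re-ID net
        self.score_thresh = score_thresh
        self.sim_thresh = sim_thresh
        self.active = {}             # track_id -> box
        self.lost = {}               # track_id -> appearance embedding
        self.next_id = 0

    def step(self, frame, detections):
        # 1) Temporal realignment: regress last-frame boxes onto the new frame.
        if self.active:
            ids = list(self.active)
            boxes, scores = self.regress(frame, [self.active[i] for i in ids])
            for tid, box, score in zip(ids, boxes, scores):
                if score >= self.score_thresh:
                    self.active[tid] = box
                else:                # track lost: remember its appearance
                    self.lost[tid] = self.embed(frame, self.active.pop(tid))
        # 2) Start or revive tracks from detections not covered by an active box.
        for det in detections:
            if any(iou(det, b) > 0.5 for b in self.active.values()):
                continue
            emb = self.embed(frame, det)
            best = max(self.lost.items(),
                       key=lambda kv: float(np.dot(kv[1], emb)),
                       default=None)
            if best is not None and float(np.dot(best[1], emb)) >= self.sim_thresh:
                tid = best[0]        # re-associate with a lost track
                del self.lost[tid]
            else:
                tid, self.next_id = self.next_id, self.next_id + 1
            self.active[tid] = det
        return dict(self.active)
```

In the actual method, the regression step would apply the detector's RoI head to the previous boxes on the current frame, and the embedding step would crop each box and pass it through the multi-scale re-identification network.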
Pages: 509-516 (8 pages)
Related Papers (showing 10 of 50)
  • [1] Deep learning-based multiple particle tracking in complex system
    Xu, Xiaoming
    Wei, Jianjun
    Sang, Sheng
    AIP ADVANCES, 2024, 14 (01)
  • [2] Deep MAnTra: deep learning-based multi-animal tracking for Japanese macaques
    Pineda, Riza Rae
    Kubo, Takatomi
    Shimada, Masaki
    Ikeda, Kazushi
    ARTIFICIAL LIFE AND ROBOTICS, 2023, 28 (01) : 127 - 138
  • [4] Live Demonstration: Deep Learning-Based Visual Tracking of Multiple Objects on a Low-Power Embedded System
    Blanco-Filgueira, Beatriz
    Garcia-Lesta, Daniel
    Fernandez-Sanjurjo, Mauro
    Brea, Victor M.
    Lopez, Paula
    2019 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2019,
  • [5] Deep Learning-based Multiple Objects Detection and Tracking System for Socially Aware Mobile Robot Navigation Framework
    Do Nam Thang
    Lan Anh Nguyen
    Pham Trung Dung
    Truong Dang Khoa
    Nguyen Huu Son
    Nguyen Tran Hiep
    Pham Van Nguyen
    Vu Duc Truong
    Dinh Hong Toan
    Nguyen Manh Hung
    Trung-Dung Ngo
    Xuan-Tung Truong
    PROCEEDINGS OF 2018 5TH NAFOSTED CONFERENCE ON INFORMATION AND COMPUTER SCIENCE (NICS 2018), 2018, : 436 - 441
  • [6] A Structured Learning-Based Graph Matching Method for Tracking Dynamic Multiple Objects
    Xiong, Hongkai
    Zheng, Dayu
    Zhu, Qingxiang
    Wang, Botao
    Zheng, Yuan F.
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2013, 23 (03) : 534 - 548
  • [7] Deep learning in multiple animal tracking: A survey
    Liu, Yeqiang
    Li, Weiran
    Liu, Xue
    Li, Zhenbo
    Yue, Jun
    COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2024, 224
  • [8] Learning multiple instance deep representation for objects tracking
    Li, Chunyu
    Li, Gang
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2020, 71
  • [9] Imitation Learning-Based Visual Servoing for Tracking Moving Objects
    Felici, Rocco
    Saveriano, Matteo
    Roveda, Loris
    Paolillo, Antonio
    HUMAN-FRIENDLY ROBOTICS 2023, HFR 2023, 2024, 29 : 110 - 122
  • [10] Deep Learning-Based Multi-class Multiple Object Tracking in UAV Video
    Micheal, A. Ancy
    Vani, K.
    JOURNAL OF THE INDIAN SOCIETY OF REMOTE SENSING, 2022, 50 (12) : 2543 - 2552