Deep MAnTra: deep learning-based multi-animal tracking for Japanese macaques

Cited by: 0
Authors
Riza Rae Pineda
Takatomi Kubo
Masaki Shimada
Kazushi Ikeda
Affiliations
[1] Nara Institute of Science and Technology, Department of Computer Science, College of Engineering
[2] University of the Philippines
[3] Teikyo University of Science
Source
Artificial Life and Robotics | 2023, Vol. 28
Keywords
Computer vision; Multi-instance tracking; Object detection; Animal tracking;
DOI
Not available
Abstract
Multi-instance object tracking is an active research problem in computer vision. Most recent methods analyze and locate targets in videos taken from static camera set-ups, as in many monitoring systems worldwide, and these have proved efficient and effective in applications such as animal behavior studies and human and road traffic monitoring. However, despite the growing success of computer vision in animal monitoring and behavior analysis, no such system has yet been developed for free-ranging Japanese macaques. Our study therefore aims to establish a tracking system for Japanese macaques in their natural habitat. We begin by training a monkey detector using You Only Look Once (YOLOv4) and investigate the effects of different transfer learning techniques, curriculum learning, and dataset heterogeneity on the model's accuracy. Using the resulting box detections from our monkey detection model, we apply SuperGlue and Murty's algorithm to re-identify individual monkeys across succeeding frames. Our best-performing Japanese macaque detection model, a YOLOv4 architecture with a spatial attention module and Mish activation function trained on a 3-stage curriculum, yielded a mean $AP^{50}$ of 96.59%, a precision of 93%, a recall of 96%, and a mean $IOU_{AP@50}$ of 77.2%. With a MOTA of 91.35% achieved even on our heterogeneous dataset, our tracking system can prove effective and reliable for animal behavior studies.
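The abstract frames re-identification across frames as an assignment problem solved with Murty's algorithm, and reports tracking quality as MOTA. The sketch below illustrates only the simpler 1-best case of that assignment (the Hungarian method, of which Murty's algorithm is the k-best generalization), using box IoU as a stand-in matching cost rather than the paper's SuperGlue descriptors; the function names and the IoU cost are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch: frame-to-frame re-identification as 1-to-1 assignment.
# Assumption: IoU of detection boxes as the similarity; the paper instead
# uses SuperGlue feature matching with Murty's (k-best) assignment.
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda box: (box[2] - box[0]) * (box[3] - box[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def match(prev_boxes, curr_boxes, min_iou=0.5):
    """Optimal 1-to-1 matching of previous-frame tracks to current detections."""
    cost = np.array([[1.0 - iou(p, c) for c in curr_boxes] for p in prev_boxes])
    rows, cols = linear_sum_assignment(cost)  # Hungarian method (1-best)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= 1.0 - min_iou]

def mota(fn, fp, idsw, gt):
    """Multiple Object Tracking Accuracy: 1 - (FN + FP + IDSW) / GT."""
    return 1.0 - (fn + fp + idsw) / gt
```

Murty's algorithm extends this by enumerating the k best assignments in increasing cost order, which lets a tracker defer ambiguous identity decisions instead of committing to the single cheapest matching.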
Pages: 127-138 (11 pages)