A Real-Time Multi-Stage Architecture for Pose Estimation of Zebrafish Head with Convolutional Neural Networks

Cited by: 1
Authors
Huang, Zhang-Jin [1 ,2 ,3 ]
He, Xiang-Xiang [1 ]
Wang, Fang-Jun [1 ,2 ]
Shen, Qing [1 ]
Affiliations
[1] Univ Sci & Technol China, Sch Comp Sci & Technol, Hefei 230027, Peoples R China
[2] Univ Sci & Technol China, Sch Data Sci, Hefei 230027, Peoples R China
[3] Anhui Prov Key Lab Software Comp & Commun, Hefei 230027, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
convolutional neural network; pose estimation; real-time; zebrafish;
DOI
10.1007/s11390-021-9599-5
CLC Number
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
In order to conduct optical neurophysiology experiments on a freely swimming zebrafish, it is essential to quantify the zebrafish head to determine exact lighting positions. To efficiently quantify a zebrafish head's behaviors with limited resources, we propose a real-time multi-stage architecture based on convolutional neural networks for pose estimation of the zebrafish head on CPUs. Each stage is implemented with a small neural network. Specifically, a light-weight object detector named Micro-YOLO is used to detect a coarse region of the zebrafish head in the first stage. In the second stage, a tiny bounding box refinement network is devised to produce a high-quality bounding box around the zebrafish head. Finally, a small pose estimation network named tiny-hourglass is designed to detect keypoints in the zebrafish head. The experimental results show that using Micro-YOLO combined with RegressNet to predict the zebrafish head region is not only more accurate but also much faster than Faster R-CNN, a representative two-stage detector. Compared with DeepLabCut, a state-of-the-art method for estimating poses of user-defined body parts, our multi-stage architecture achieves higher accuracy and runs 19x faster on CPUs.
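The three-stage flow described in the abstract (coarse detection, bounding-box refinement, keypoint estimation) can be sketched as below. This is a minimal illustrative outline only: all function names and the fixed-region logic are hypothetical placeholders standing in for the Micro-YOLO, refinement, and tiny-hourglass networks, not the authors' implementation.

```python
# Hypothetical sketch of the paper's three-stage pipeline.
# Each stub stands in for one small network; boxes are (x, y, w, h) tuples.

def micro_yolo_detect(frame):
    """Stage 1: light-weight detector returns a coarse head region."""
    h, w = len(frame), len(frame[0])
    # Placeholder: pretend the head occupies the central quarter of the frame.
    return (w // 4, h // 4, w // 2, h // 2)

def refine_bbox(frame, box):
    """Stage 2: tiny refinement network tightens the coarse box."""
    x, y, w, h = box
    m = min(w, h) // 8  # a fixed shrink margin stands in for the regressor
    return (x + m, y + m, w - 2 * m, h - 2 * m)

def tiny_hourglass(frame, box):
    """Stage 3: small hourglass network predicts keypoints inside the box."""
    x, y, w, h = box
    # Placeholder keypoints: box centre and the two horizontal edge midpoints.
    return [(x + w // 2, y + h // 2), (x, y + h // 2), (x + w, y + h // 2)]

def estimate_head_pose(frame):
    """Full pipeline: coarse detection -> bbox refinement -> keypoints."""
    coarse = micro_yolo_detect(frame)
    tight = refine_bbox(frame, coarse)
    return tiny_hourglass(frame, tight)

frame = [[0] * 640 for _ in range(480)]  # dummy 640x480 grayscale frame
keypoints = estimate_head_pose(frame)
print(keypoints)
```

The point of the staged design is that each step only has to solve a small sub-problem, so each network can stay tiny enough for real-time CPU inference.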
Pages: 434-444
Page count: 11
Related Papers
50 records
  • [11] Grasping Pose Estimation for Robots Based on Convolutional Neural Networks
    Zheng, Tianjiao
    Wang, Chengzhi
    Wan, Yanduo
    Zhao, Sikai
    Zhao, Jie
    Shan, Debin
    Zhu, Yanhe
    MACHINES, 2023, 11 (10)
  • [12] HEAD DETECTION BASED ON CONVOLUTIONAL NEURAL NETWORK WITH MULTI-STAGE WEIGHTED FEATURE
    Rui, Ting
    Fei, Jian-chao
    Cui, Peng
    Zhou, You
    Fang, Hu-sheng
    2015 IEEE CHINA SUMMIT & INTERNATIONAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING, 2015, : 147 - 150
  • [13] Depth Data Filtering for Real-time Head Pose Estimation with Kinect
    Qiao Ti-zhou
    Dai Shu-ling
    2013 6TH INTERNATIONAL CONGRESS ON IMAGE AND SIGNAL PROCESSING (CISP), VOLS 1-3, 2013, : 953 - 958
  • [14] Head Pose Estimation in the Wild Assisted by Facial Landmarks Based on Convolutional Neural Networks
    Xia, Jiahao
    Cao, Libo
    Zhang, Guanjun
    Liao, Jiacai
    IEEE ACCESS, 2019, 7 : 48470 - 48483
  • [15] Robust Real-time Head Pose Estimation for 10 Watt SBC
    Wassef E.
    El Munim H.E.A.
    Hammad S.
    Ghoneima M.
International Journal of Advanced Computer Science and Applications, 2021, 12 (07) : 578 - 585
  • [16] Dynamic random regression forests for real-time head pose estimation
    Ying, Ying
    Wang, Han
    MACHINE VISION AND APPLICATIONS, 2013, 24 (08) : 1705 - 1719
  • [18] Lightweight Architecture for Real-Time Hand Pose Estimation with Deep Supervision
    Wu, Yufei
    Ruan, Xiaofei
    Zhang, Yu
    Zhou, Huang
    Du, Shengyu
    Wu, Gang
    SYMMETRY-BASEL, 2019, 11 (04):
  • [20] Real-Time Head Pose Estimation Using Weighted Random Forests
    Kim, Hyunduk
    Sohn, Myoung-Kyu
    Kim, Dong-Ju
    Ryu, Nuri
    COMPUTATIONAL COLLECTIVE INTELLIGENCE: TECHNOLOGIES AND APPLICATIONS, ICCCI 2014, 2014, 8733 : 554 - 562