Real-time motion tracking using optical flow on multiple GPUs

Cited by: 38
Authors
Mahmoudi, S. A. [1 ]
Kierzynka, M. [2 ,3 ]
Manneback, P. [1 ]
Kurowski, K. [2 ]
Affiliations
[1] Univ Mons, B-7000 Mons, Belgium
[2] Poznan Supercomp & Networking Ctr, PL-61704 Poznan, Poland
[3] Poznan Univ Tech, PL-60965 Poznan, Poland
Keywords
Lucas-Kanade method; sparse optical flow; multiple GPU computations; hidden Markov models; algorithm; framework
DOI
10.2478/bpasts-2014-0016
Chinese Library Classification
T [Industrial Technology];
Subject Classification
08;
Abstract
Motion tracking algorithms are widely used in computer vision research. However, new video standards, especially high-resolution formats, mean that current implementations, even running on modern hardware, no longer meet the needs of real-time processing. To overcome this challenge, several GPU (Graphics Processing Unit) computing approaches have recently been proposed. Although they demonstrate the great potential of the GPU platform, hardly any of them can process high-definition video sequences efficiently. A tool was therefore needed to address this problem. In this paper we present software that implements optical flow motion tracking using the Lucas-Kanade algorithm. It is integrated with the Harris corner detector, so the algorithm can perform sparse tracking, i.e. track only the meaningful pixels, which substantially lowers the computational burden of the method. Moreover, both parts of the algorithm, i.e. corner selection and tracking, are implemented on the GPU; as a result, the software is extremely fast, allowing real-time motion tracking of videos in Full HD or even 4K format. To deliver the highest performance, it also supports multiple-GPU systems, on which it scales very well.
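The two-stage pipeline described in the abstract (Harris corner selection followed by sparse pyramidal Lucas-Kanade tracking) can be illustrated with a minimal sketch. The block below is not the paper's multi-GPU CUDA implementation; it uses OpenCV's standard CPU functions (cv2.goodFeaturesToTrack with the Harris response and cv2.calcOpticalFlowPyrLK), and the input file name and all parameter values are assumptions chosen purely for illustration.

```python
# Minimal sketch of sparse optical-flow tracking: Harris corner selection
# followed by pyramidal Lucas-Kanade tracking between consecutive frames.
# NOTE: this is NOT the paper's multi-GPU CUDA implementation; it uses
# OpenCV's standard CPU API, and every parameter value and the input file
# name are illustrative assumptions.
import cv2

cap = cv2.VideoCapture("input.mp4")          # hypothetical video file
ok, prev_frame = cap.read()
prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)

# Select the "meaningful" pixels to track (Harris corner response).
prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=2000,
                                   qualityLevel=0.01, minDistance=7,
                                   useHarrisDetector=True, k=0.04)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Pyramidal Lucas-Kanade: track the selected corners into the new frame.
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, prev_pts, None, winSize=(21, 21), maxLevel=3)

    # Keep only successfully tracked points; their displacements
    # (good_new - good_old) are the sparse optical-flow vectors.
    good_new = next_pts[status == 1]
    good_old = prev_pts[status == 1]

    prev_gray = gray
    prev_pts = good_new.reshape(-1, 1, 2)

cap.release()
```

In the paper both stages run as GPU kernels and the workload is distributed across multiple devices; the sketch above only mirrors the algorithmic structure, not that implementation.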
Pages: 139-150
Number of pages: 12
Related Papers (50 in total)
  • [31] Real-time multiple people tracking using competitive condensation
    Kang, HG
    Kim, D
    PATTERN RECOGNITION, 2005, 38 (07) : 1045 - 1058
  • [32] Real-time multiple people tracking using competitive condensation
    Kang, HG
    Kim, DJ
    Bang, SY
    2002 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, VOL III, PROCEEDINGS, 2002, : 325 - 328
  • [33] Real-Time Human Motion Detection and Tracking
    Zarka, Nizar
    Alhalah, Ziad
    Deeb, Rada
    2008 3RD INTERNATIONAL CONFERENCE ON INFORMATION AND COMMUNICATION TECHNOLOGIES: FROM THEORY TO APPLICATIONS, VOLS 1-5, 2008, : 1056 - +
  • [34] Real-time multiple people tracking using competitive condensation
    Kang, Heegu
    Kim, Daijin
    Bang, Sung Yang
    Proceedings - International Conference on Pattern Recognition, 2002, 16 (01): : 413 - 416
  • [35] Real-Time Object Tracking with Motion Information
    Wang, Chaoqun
    Sun, Xiaoyan
    Chen, Xuejin
    Zeng, Wenjun
    2018 IEEE INTERNATIONAL CONFERENCE ON VISUAL COMMUNICATIONS AND IMAGE PROCESSING (IEEE VCIP), 2018,
  • [36] Real-time compressive tracking with motion estimation
    Wu, Jiayun
    Chen, Daquan
    Yi, Rui
    2013 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO), 2013, : 2374 - 2379
  • [37] An Agile Framework for Real-Time Motion Tracking
    Basu, Saikat
    DiBiano, Robert
    Karki, Manohar
    Stagg, Malcolm
    Weltman, Jerry
    Mukhopadhyay, Supratik
    Ganguly, Sangram
    IEEE 39TH ANNUAL COMPUTER SOFTWARE AND APPLICATIONS CONFERENCE WORKSHOPS (COMPSAC 2015), VOL 3, 2015, : 205 - 210
  • [38] Real-time detection method of human motion based on optical flow
    Shi, Jia-Dong
    Wang, Jian-Zhong
    Wang, Hong-Ru
    Beijing Ligong Daxue Xuebao/Transaction of Beijing Institute of Technology, 2008, 28 (09): : 794 - 797
  • [39] Face Tracking Using Optical Flow: Development of a Real-Time AdaBoost Cascade Face Tracker
    Ranftl, Andreas
    Alonso-Fernandez, Fernando
    Karlsson, Stefan
    BIOSIG 2015 PROCEEDINGS OF THE 14TH INTERNATIONAL CONFERENCE OF THE BIOMETRICS SPECIAL INTEREST GROUP, 2015,
  • [40] Real-Time Optical Flow Estimation Using Multiple Frame-Straddling Intervals
    Chen, Lei
    Yang, Hua
    Takaki, Takeshi
    Ishii, Idaku
    JOURNAL OF ROBOTICS AND MECHATRONICS, 2012, 24 (04) : 686 - 698