Event Camera Based Real-Time Detection and Tracking of Indoor Ground Robots

Cited by: 18
Authors
Iaboni, Craig [1 ]
Patel, Himanshu [1 ]
Lobo, Deepan [2 ]
Choi, Ji-Won [2 ]
Abichandani, Pramod [3 ]
Affiliations
[1] New Jersey Institute of Technology, Ying Wu College of Computing (YWCC), Newark, NJ 07103 USA
[2] New Jersey Institute of Technology, Department of Electrical & Computer Engineering, Newark College of Engineering (NCE), Newark, NJ 07103 USA
[3] New Jersey Institute of Technology, Newark College of Engineering (NCE), Robotics & Data Lab (RADLab), Newark, NJ 07103 USA
Keywords
Event cameras; multi-robot systems; detection and tracking; clustering and pattern recognition; NEUROMORPHIC VISION; LATENCY; ALGORITHM; SENSOR; PIXEL; SHIFT
DOI
10.1109/ACCESS.2021.3133533
CLC Classification Number
TP [automation technology; computer technology]
Subject Classification Code
0812
Abstract
This paper presents a real-time method to detect and track multiple mobile ground robots using event cameras. The method uses density-based spatial clustering of applications with noise (DBSCAN) to detect the robots and a single k-dimensional (k-d) tree to accurately track them as they move in an indoor arena. Robust detections and tracks are maintained despite event camera noise and a lack of events (due to robots moving slowly or stopping). An off-the-shelf RGB camera-based tracking system provided ground truth. Experiments involving up to four robots were performed to study the effects on detection and tracking performance of i) varying the DBSCAN parameters, ii) the event accumulation time, iii) the number of robots in the arena, iv) the speed of the robots, v) variation in ambient light conditions, and vi) alternative clustering algorithms. The experimental results showed 100% detection and tracking fidelity in the face of event camera noise and robots stopping for tests involving up to three robots (and upwards of 93% for four robots). When the lighting conditions were varied, a graceful degradation in detection and tracking fidelity was observed.
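The pipeline the abstract describes (cluster accumulated events with DBSCAN to detect robots, then associate cluster centroids to existing tracks) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the `eps`, `min_pts`, and `gate` parameters are placeholders, and the greedy brute-force nearest-neighbor association stands in for the paper's k-d tree lookup.

```python
import math

def dbscan(points, eps, min_pts):
    """Label each (x, y) event with a cluster id, or -1 for noise."""
    labels = [None] * len(points)

    def region(i):
        xi, yi = points[i]
        return [j for j, (xj, yj) in enumerate(points)
                if math.hypot(xi - xj, yi - yj) <= eps]

    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = region(i)
        if len(nbrs) < min_pts:
            labels[i] = -1                     # provisional noise
            continue
        labels[i] = cid
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:                # noise becomes a border point
                labels[j] = cid
            if labels[j] is not None:
                continue
            labels[j] = cid
            if len(region(j)) >= min_pts:      # core point: keep expanding
                seeds.extend(region(j))
        cid += 1
    return labels

def cluster_centroids(points, labels):
    """Mean (x, y) per cluster; noise (-1) is ignored."""
    sums = {}
    for (x, y), lab in zip(points, labels):
        if lab == -1:
            continue
        sx, sy, n = sums.get(lab, (0.0, 0.0, 0))
        sums[lab] = (sx + x, sy + y, n + 1)
    return [(sx / n, sy / n) for sx, sy, n in sums.values()]

def associate(tracks, detections, gate):
    """Greedily match each detection to the nearest unmatched track id."""
    matches, free = {}, dict(tracks)
    for det in detections:
        if not free:
            break
        tid = min(free, key=lambda t: math.dist(free[t], det))
        if math.dist(free[tid], det) <= gate:
            matches[tid] = det
            del free[tid]
    return matches
```

For example, two tight blobs of events plus one stray event yield two detections and one noise label; the detections then snap onto the two nearest prior tracks. The `gate` radius plays the role of rejecting spurious associations when a robot produces few events (slow motion or stopping), which the abstract identifies as a key failure mode.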
Pages: 166588-166602
Number of pages: 15
Related Papers
50 records
  • [31] CU-Track: A Multi-Camera Framework for Real-Time Multi-Object Tracking
    Bamrungthai, Pongsakon
    Sangveraphunsiri, Viboon
AUTOMATIC CONTROL AND MECHATRONIC ENGINEERING II, 2013, 415: 325-332
  • [32] Real-time distributed trajectory planning for mobile robots
    Nguyen, Binh
    Nghiem, Truong
    Nguyen, Linh
    Nguyen, Anh Tung
    Nguyen, Thang
IFAC PAPERSONLINE, 2023, 56 (02): 2152-2157
  • [33] Machine learning-based real-time tracking for concrete vibration
    Quan, Yuhu
    Wang, Fenglai
    AUTOMATION IN CONSTRUCTION, 2022, 140
  • [34] Real-time detection of track fasteners based on object detection and FPGA
    Xiao, Tian
    Xu, Tianhua
    Wang, Guang
    MICROPROCESSORS AND MICROSYSTEMS, 2023, 100
  • [35] Real-time text tracking in natural scenes
    Merino-Gracia, Carlos
    Mirmehdi, Majid
IET COMPUTER VISION, 2014, 8 (06): 670-681
  • [36] A robust, real-time camera-based eye gaze tracking system to analyze users' visual attention using deep learning
    Singh, Jaiteg
    Modi, Nandini
INTERACTIVE LEARNING ENVIRONMENTS, 2024, 32 (02): 409-430
  • [37] Real-time camera-based eye gaze tracking using convolutional neural network: a case study on social media website
    Modi, Nandini
    Singh, Jaiteg
VIRTUAL REALITY, 2022, 26 (04): 1489-1506
  • [38] Real-time gait event detection for normal subjects from lower trunk accelerations
    Gonzalez, Rafael C.
    Lopez, Antonio M.
    Rodriguez-Uria, Javier
    Alvarez, Diego
    Alvarez, Juan C.
GAIT & POSTURE, 2010, 31 (03): 322-325
  • [39] Novel method for real-time detection and tracking of pig body and its different parts
    Chen, Fuen
    Liang, Xiaoming
    Chen, Longhan
    Liu, Baoyuan
    Lan, Yubin
INTERNATIONAL JOURNAL OF AGRICULTURAL AND BIOLOGICAL ENGINEERING, 2020, 13 (06): 144-149
  • [40] Real-time automatic crack detection method based on drone
    Meng, Shiqiao
    Gao, Zhiyuan
    Zhou, Ying
    He, Bin
    Djerrad, Abderrahim
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, 2023, 38 (07): 849-872