Context-Aware Action with a Small Mobile Robot

Cited by: 0
Authors
Withey, Daniel [1 ]
Mogokonyane, Katlego [1 ]
Tikam, Mayur [1 ]
Holder, Ross [2 ]
Veeraragoo, Mahalingam [2 ]
Gambushe, Mxolisi [1 ]
Affiliations
[1] CSIR, Mobile Intelligent Autonomous Systems, Pretoria, South Africa
[2] CSIR, Information Security Research Centre, Pretoria, South Africa
Source
2020 INTERNATIONAL SAUPEC/ROBMECH/PRASA CONFERENCE | 2020
Keywords
Industry 4.0; 4IR; mobile GPU; deep neural networks; mobile robotics; autonomous inspection
DOI
10.1109/saupec/robmech/prasa48453.2020.9041114
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
Simultaneous advances in mobile GPU computing and real-time object recognition now enable machines to make decisions and take actions based on the detection of objects of interest in the environment. An implementation of a mobile robot system is described that combines autonomous exploration and mapping capabilities with a real-time object recognition method based on a deep neural network running on a mobile GPU. The system can detect objects of interest and then take real-time actions to interact with them, in this case by moving to acquire inspection-style images of the object from multiple angles. The robot system is small, self-contained, and battery-powered. The system shows the potential for the development of robotic systems with context awareness, permitting advanced autonomy.
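The multi-angle inspection behavior described in the abstract can be sketched as viewpoint generation around a detected object: given the object's estimated map-frame position, compute a set of stand-off goal poses on a circle around it, each oriented to face the object. This is a minimal illustrative sketch, not the paper's implementation; the function name, the choice of evenly spaced circular viewpoints, and the default parameters are assumptions.

```python
import math

def inspection_waypoints(obj_x, obj_y, radius=1.0, n_views=4):
    """Generate n_views (x, y, yaw) goal poses on a circle of the given
    radius around a detected object, each with the heading pointing at
    the object, so a robot can capture inspection images from multiple
    angles. Coordinates are in the map frame; yaw is in radians."""
    waypoints = []
    for k in range(n_views):
        theta = 2.0 * math.pi * k / n_views     # viewpoint angle on the circle
        x = obj_x + radius * math.cos(theta)    # stand-off position
        y = obj_y + radius * math.sin(theta)
        yaw = math.atan2(obj_y - y, obj_x - x)  # heading toward the object
        waypoints.append((x, y, yaw))
    return waypoints

# Example: four viewpoints, 1 m from an object detected at the map origin.
goals = inspection_waypoints(0.0, 0.0, radius=1.0, n_views=4)
```

In a complete system each pose would be dispatched to the navigation stack as a goal, with unreachable viewpoints (e.g. inside obstacles in the occupancy map) filtered out first.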
Pages: 410-415
Page count: 6