Selecting and Commanding Groups in a Multi-Robot Vision Based System

Cited: 9
Authors
Milligan, Brian [1 ]
Mori, Greg [1 ]
Vaughan, Richard [1 ]
Affiliation
[1] Simon Fraser Univ, Sch Comp Sci, Burnaby, BC, Canada
Source
PROCEEDINGS OF THE 6TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTIONS (HRI 2011) | 2011
Keywords
Computer Vision; Multiple Robots; Selection; Pointing; Task allocation and coordination; User Feedback;
DOI
10.1145/1957656.1957809
CLC Number
TP3 [computing technology; computer technology];
Subject Classification Code
0812 ;
Abstract
We present a novel method for a human user to select groups of robots without using any external instruments. We use computer vision techniques to read hand gestures from a user and use the hand gesture information to select single or multiple robots from a population and assign them to a task. To select robots, the user simply draws a circle in the air around the robots he or she wants to command. Once the user has selected the group of robots, he or she can send them to a location by pointing at a target location. To achieve this we use cameras mounted on mobile robots to find the user's face and then track his or her hand. Our method exploits an observation from human-robot interaction research on pointing, which found that a human's target when pointing is best inferred using the line from the human's eyes to his or her extended hand [1]. When the user circles robots, the projected eye-to-hand lines form a cone-like shape that envelops the selected robots. From a 2D camera mounted on a robot, this cone is seen with the user's face as the vertex and the hand movements as a circular slice of the cone. We show in the video how the robots can tell whether they have been selected by testing whether the face is within the circle made by the hand: if the face is within the circle, the robot was selected; if the face is outside the circle, it was not. Following selection, the robots read a command by looking for a pointing gesture, which is detected by an outstretched hand. From the pointing gesture the robots collectively infer which target the user is pointing at by calculating the distance and direction that the hand moved relative to the face. The selected robots then travel to the target, and unselected robots can then be selected and commanded as desired. The robots communicate their state to the user through LED lights on the robots' chassis. When a robot is searching for the user's face, its LEDs flash to get the user's attention (as frontal faces are easiest to find).
When a robot finds the user's face, the lights turn solid yellow to indicate that it is ready to be selected. When selected, the robot's LEDs turn blue to indicate that it can now be commanded. Once robots are sent off to a location, the remaining robots can be selected and assigned another task. We demonstrate this method working on low-powered Atom netbooks and off-the-shelf USB web cameras. This is the first working implementation of a system that allows a human to select and command groups of robots without using any external instruments.
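The face-in-circle selection test described in the abstract can be sketched as a standard point-in-polygon check in image coordinates: the tracked hand positions form a closed polygon, and a robot considers itself selected when the detected face centre lies inside it. This is an illustrative sketch only; the function names, the ray-casting algorithm, and the synthetic hand track below are assumptions, not details taken from the paper.

```python
import math

def face_inside_hand_circle(face, hand_track):
    """Ray-casting point-in-polygon test: treat the tracked hand
    positions (image coordinates) as a closed polygon and report
    whether the face centre lies inside it."""
    x, y = face
    inside = False
    n = len(hand_track)
    for i in range(n):
        x1, y1 = hand_track[i]
        x2, y2 = hand_track[(i + 1) % n]
        # Count crossings of the horizontal ray from (x, y) with each edge.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Synthetic hand track: a rough circle of radius 2 around the origin,
# standing in for hand positions observed by the robot's camera.
track = [(2 * math.cos(i * math.pi / 8), 2 * math.sin(i * math.pi / 8))
         for i in range(16)]
face_inside_hand_circle((0.0, 0.0), track)   # face encircled -> selected
face_inside_hand_circle((5.0, 0.0), track)   # face outside -> not selected
```

In practice a library routine such as OpenCV's `cv::pointPolygonTest` would serve the same purpose; the hand-rolled version here only makes the geometric idea explicit.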
Pages: 415-415
Number of pages: 1