Selecting and Commanding Groups in a Multi-Robot Vision Based System

Cited by: 9
Authors
Milligan, Brian [1]
Mori, Greg [1]
Vaughan, Richard [1]
Affiliations
[1] Simon Fraser Univ, Sch Comp Sci, Burnaby, BC, Canada
Source
PROCEEDINGS OF THE 6TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI 2011) | 2011
Keywords
Computer Vision; Multiple Robots; Selection; Pointing; Task allocation and coordination; User Feedback
DOI
10.1145/1957656.1957809
CLC number
TP3 [computing technology; computer technology]
Subject classification code
0812
Abstract
We present a novel method for a human user to select groups of robots without using any external instruments. We use computer vision techniques to read hand gestures from a user and use the hand gesture information to select single or multiple robots from a population and assign them to a task. To select robots, the user simply draws a circle in the air around the robots that the user wants to command. Once the user selects the group of robots, he or she can send them to a location by pointing to a target location. To achieve this we use cameras mounted on mobile robots to find the user's face and then track his or her hand. Our method exploits an observation from human-robot interaction research on pointing, which found that a human's target when pointing is best inferred using the line from the human's eyes to the extended hand [1]. When circling robots, the projected eye-to-hand lines form a cone-like shape that envelops the selected robots. From a 2D camera mounted on the robot, this cone is seen with the user's face as the vertex and the hand movements as a circular slice of the cone. We show in the video how the robots can tell whether they have been selected by testing whether the face is within the circle made by the hand: if the face is within the circle, the robot was selected; if the face is outside the circle, it was not. Following selection, the robots read a command by looking for a pointing gesture, detected as an outstretched hand. From the pointing gesture the robots collectively infer which target is being pointed at by calculating the distance and direction the hand moved relative to the face. The selected robots then travel to the target, and unselected robots can then be selected and commanded as desired. The robots communicate their state to the user through LED lights on the robots' chassis. When a robot is searching for the user's face, the LEDs flash to get the user's attention (as it is easiest to find frontal faces). When the robots find the user's face, the lights become a solid yellow to indicate that they are ready to be selected. When selected, the robots' LEDs turn blue to indicate they can now be commanded. Once robots are sent off to a location, remaining robots can then be selected and assigned another task. We demonstrate this method working on low-powered Atom netbooks and off-the-shelf USB web cameras. This is the first working implementation of a system that allows a human to select and command groups of robots without using any external instruments.
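The selection test described above — each robot checks whether the user's face, as seen by its own camera, lies inside the closed curve the hand traced in the image — amounts to a 2D point-in-polygon test over the tracked hand positions. A minimal sketch (not the authors' implementation; the function names and the ray-casting formulation are our own assumptions):

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: is `point` (x, y) inside the closed `polygon`,
    given as a list of (x, y) vertices in image coordinates?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray from `point` with each edge;
        # an odd number of crossings means the point is inside.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def robot_is_selected(face_centre, hand_track):
    """A robot considers itself selected when the detected face centre
    falls inside the circle traced by the user's hand."""
    return point_in_polygon(face_centre, hand_track)
```

In practice the hand track would come from a per-frame hand tracker and the face centre from a frontal face detector; the polygon test itself is independent of how those points were obtained.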
Pages: 415 - 415
Number of pages: 1