Selecting and Commanding Groups in a Multi-Robot Vision Based System

Cited: 9
Authors
Milligan, Brian [1]
Mori, Greg [1]
Vaughan, Richard [1]
Affiliations
[1] Simon Fraser Univ, Sch Comp Sci, Burnaby, BC, Canada
Source
PROCEEDINGS OF THE 6TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTIONS (HRI 2011) | 2011
Keywords
Computer Vision; Multiple Robots; Selection; Pointing; Task allocation and coordination; User Feedback;
DOI
10.1145/1957656.1957809
CLC Number
TP3 [Computing Technology; Computer Technology];
Subject Classification Code
0812;
Abstract
We present a novel method for a human user to select groups of robots without using any external instruments. We use computer vision techniques to read hand gestures from a user and use the hand gesture information to select single or multiple robots from a population and assign them to a task. To select robots, the user simply draws a circle in the air around the robots he or she wants to command. Once the user has selected a group of robots, he or she can send them to a location by pointing at a target location. To achieve this we use cameras mounted on the mobile robots to find the user's face and then track his or her hand. Our method exploits an observation from human-robot interaction research on pointing, which found that a human's pointing target is best inferred using the line from the human's eyes to the extended hand [1]. When the user circles robots, the projected eye-to-hand lines form a cone-like shape that envelops the selected robots. From a 2D camera mounted on a robot, this cone is seen with the user's face as the vertex and the hand movement as a circular slice of the cone. We show in the video how each robot can tell whether it has been selected by testing whether the face lies within the circle drawn by the hand: if the face is within the circle, the robot was selected; if the face is outside the circle, it was not. Following selection, the robots read a command by looking for a pointing gesture, detected as an outstretched hand. From the pointing gesture the robots collectively infer which target is being pointed at by calculating the distance and direction the hand moved relative to the face. The selected robots then travel to the target, and unselected robots can be selected and commanded in turn. The robots communicate their state to the user through LEDs on their chassis. When a robot is searching for the user's face, its LEDs flash to get the user's attention (since frontal faces are easiest to detect). When a robot finds the user's face, its lights turn solid yellow to indicate that it is ready to be selected. When selected, a robot's LEDs turn blue to indicate that it can now be commanded. Once robots have been sent off to a location, the remaining robots can be selected and assigned another task. We demonstrate this method working on low-powered Atom netbooks and off-the-shelf USB web cameras. This is the first working implementation of a system that allows a human to select and command groups of robots without using any external instruments.
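The abstract's selection test reduces to a simple geometric check in the robot's camera image: collect the tracked hand positions while the user circles, then ask whether the detected face center lies inside the closed curve the hand traced. The following is a minimal Python sketch of that check (not the authors' published code), using a standard ray-casting point-in-polygon test; the face and hand coordinates are assumed to come from an upstream detector (e.g. face and hand tracking, abstracted away here), and the helper names face_inside_hand_circle and pointing_offset are hypothetical.

from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) pixel coordinates in the robot's camera image

def face_inside_hand_circle(face: Point, hand_path: List[Point]) -> bool:
    """Ray-casting point-in-polygon test: is the face center inside the
    closed curve traced by the tracked hand positions? If so, the user's
    circling gesture enclosed this robot, i.e. it has been selected."""
    x, y = face
    inside = False
    n = len(hand_path)
    for i in range(n):
        x1, y1 = hand_path[i]
        x2, y2 = hand_path[(i + 1) % n]  # wrap around to close the curve
        if (y1 > y) != (y2 > y):  # this edge crosses the horizontal ray through y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:  # crossing is to the right of the face center
                inside = not inside
    return inside

def pointing_offset(face: Point, hand: Point) -> Point:
    """Displacement of the outstretched hand relative to the face: the 2D
    image-plane analogue of the eyes-to-hand line used to infer the
    pointing target [1]."""
    return (hand[0] - face[0], hand[1] - face[1])

# Example: a hand circle drawn around the image center selects a robot that
# sees the face at (320, 240) but not one that sees it at (50, 60).
circle = [(220, 140), (420, 140), (420, 340), (220, 340)]
print(face_inside_hand_circle((320, 240), circle))  # True  -> selected
print(face_inside_hand_circle((50, 60), circle))    # False -> not selected

In the abstract's terms, the hand path is the circular slice of the eye-to-hand cone as seen from the robot: a robot inside the cone sees the user's face enclosed by the curve, while a robot outside the cone sees the face fall outside it.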
Pages: 415 - 415
Page count: 1
Related Papers
50 records in total
  • [1] A Vision Based Multi-robot Cooperative Semantic SLAM Algorithm
    Peng, Ji
    Li, Xiaoqiang
    Wei, Gao
    Ming, Li
    2022 34TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2022, : 5663 - 5668
  • [2] Computer Vision-based Algae Removal Planner for Multi-robot Teams
    Penmetcha, Manoj
    Luo, Shaocheng
    Samantaray, Arabinda
    Dietz, J. Eric
    Yang, Baijian
    Min, Byung-Cheol
    2019 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC), 2019, : 1575 - 1581
  • [3] Computer Vision System for Multi-Robot Construction Waste Management: Integrating Cloud and Edge Computing
    Wang, Zeli
    Yang, Xincong
    Zheng, Xianghan
    Huang, Daoyin
    Jiang, Binfei
    BUILDINGS, 2024, 14 (12)
  • [4] Multi-camera multi-robot visual localization system
    Magiera, Artur Morys
    Dlugosz, Marek
    Skruch, Pawel
    2024 28TH INTERNATIONAL CONFERENCE ON METHODS AND MODELS IN AUTOMATION AND ROBOTICS, MMAR 2024, 2024, : 375 - 380
  • [5] Efficient Multi-Robot Cooperative Transportation Scheduling System
    Li, Xiaodong
    Lin, Yangfei
    Du, Zhaoyang
    Yin, Rui
    Wu, Celimuge
    2024 INTERNATIONAL CONFERENCE ON UBIQUITOUS COMMUNICATION, UCOM 2024, 2024, : 449 - 454
  • [6] Autonomous Manufacturing of Composite Parts by a Multi-Robot System
    Schuster, Alfons
    Kupke, Michael
    Larsen, Lars
    27TH INTERNATIONAL CONFERENCE ON FLEXIBLE AUTOMATION AND INTELLIGENT MANUFACTURING, FAIM2017, 2017, 11 : 249 - 255
  • [7] Decentralized Vision-Based Byzantine Agent Detection in Multi-robot Systems with IOTA Smart Contracts
    Salimpour, Sahar
    Keramat, Farhad
    Queralta, Jorge Pena
    Westerlund, Tomi
    FOUNDATIONS AND PRACTICE OF SECURITY, FPS 2022, 2023, 13877 : 322 - 337
  • [8] Formation constrained multi-robot system in unknown environments
    Cao, ZQ
    Xie, LJ
    Zhang, B
    Wang, S
    Tan, M
    2003 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-3, PROCEEDINGS, 2003, : 735 - 740
  • [9] On the Performance of Multi-Robot Wireless-Based Networks
    Nguyen, Van Son
    Chien, Trinh Van
    Hoa, Dang Khanh
    Hung, Do Dinh
    Nguyen, Hoai Giang
    Le, Chi Quynh
    Tu, Lam-Thanh
    RADIOENGINEERING, 2024, 33 (01) : 127 - 135
  • [10] Multi-Robot Cooperative Hunting
    Shen, He
    Li, Ni
    Rojas, Salvador
    Zhang, Lanchun
    2016 INTERNATIONAL CONFERENCE ON COLLABORATION TECHNOLOGIES AND SYSTEMS (CTS), 2016, : 349 - 353