Human Behavior Analysis in Human-Robot Cooperation with AR Glasses

Cited by: 0
Authors
Owaki, Koichi [1 ]
Techasarntiku, Nattaon [1 ]
Shimonishi, Hideyuki [1 ]
Affiliations
[1] Osaka Univ, Suita, Osaka, Japan
Source
2023 IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY, ISMAR | 2023
Keywords
Human-centered computing; Human-computer interaction (HCI); HCI design and evaluation methods; User studies; Navigation; Collaboration
DOI
10.1109/ISMAR59233.2023.00016
CLC number
TP3 [Computing technology, computer technology]
Discipline code
0812
Abstract
To achieve efficient human-robot cooperation, humans and robots must work in close proximity while safety is ensured. In conventional robot control, however, a certain distance between humans and robots must be maintained for safety, owing to control uncertainties and unexpected human actions, which limits the efficiency of robot operations. This study therefore aims to establish a human-robot cooperation aiding system that addresses both safety and efficiency in close-proximity situations. We propose two Augmented Reality (AR) interfaces that display robot information via AR glasses, allowing workers to monitor the robot while focusing on their task and avoiding collisions with it. AR glasses provide the hands-free communication required in work environments such as warehouses or convenience-store backrooms, and can present multiple information levels, simple or informative, to balance accuracy against the ease of human recognition. We conducted a comparative evaluation experiment with 24 participants and found that both safety and efficiency improved with the proposed user interfaces (UIs). We also collected position, head-motion, and eye-tracking data from the AR glasses to gain insight into human behavior during the tasks for each UI. Consequently, we clarified how the participants behaved under each condition and how their behavior contributed to safety and efficiency.
Pages: 20-28 (9 pages)