Investigating the Role of Multi-modal Social Cues in Human-Robot Collaboration in Industrial Settings

Authors
Hoang-Long Cao
Constantin Scholz
Joris De Winter
Ilias El Makrini
Bram Vanderborght
Affiliations
[1] Vrije Universiteit Brussel, BruBotics
[2] Flanders Make
[3] imec
Source
International Journal of Social Robotics | 2023, Vol. 15
Keywords
Collaborative robots; Multi-modal social cues; Godspeed; Acceptance
DOI: not available
Abstract
Expressing social cues through different communication channels plays an important role in mutual understanding, in both human-human and human-robot collaboration. A few studies have investigated the effects of zoomorphic and anthropomorphic social cues expressed by industrial robot arms on robot-to-human communication. In this work, we investigate the role of multi-modal social cues by combining the robot's head-like gestures with light and sound modalities in two studies. The first study found that multi-modal social cues have positive effects on people's perception of the robot, perceived enjoyment, and intention to use. The second study found that combining human-like gestures with light and/or sound modalities can lead to higher understandability of the robot's social cues. These findings suggest the use of multi-modal social cues for robots in industrial settings. However, possible negative impacts of implementing these social cues, e.g., overtrust and distraction, should also be considered.
Pages: 1169-1179
Number of pages: 10
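
To make the abstract's idea of combining modalities concrete, here is a purely illustrative Python sketch, not code from the paper: a hypothetical dispatcher (`signal_operator`, `CueChannel`, `SocialCue` are invented names) that expresses a single robot-to-human social cue over any combination of gesture, light, and sound channels. A real deployment would drive the robot arm, lights, and speaker instead of printing.

```python
# Purely illustrative sketch (not from the paper): a hypothetical dispatcher
# that expresses one robot-to-human social cue over selected modalities.
from dataclasses import dataclass
from enum import Flag, auto


class CueChannel(Flag):
    GESTURE = auto()  # head-like gesture performed by the robot arm
    LIGHT = auto()    # light signal, e.g., an LED pattern
    SOUND = auto()    # auditory signal


@dataclass
class SocialCue:
    meaning: str          # e.g., "acknowledge", "warn", "request attention"
    channels: CueChannel  # which modalities express the cue


def signal_operator(cue: SocialCue) -> None:
    """Dispatch the cue to each selected modality; placeholders print
    instead of driving real gesture, light, or sound hardware."""
    if CueChannel.GESTURE in cue.channels:
        print(f"[gesture] head-like motion expressing '{cue.meaning}'")
    if CueChannel.LIGHT in cue.channels:
        print(f"[light] light pattern expressing '{cue.meaning}'")
    if CueChannel.SOUND in cue.channels:
        print(f"[sound] tone expressing '{cue.meaning}'")


if __name__ == "__main__":
    # Multi-modal condition: gesture combined with both light and sound.
    signal_operator(SocialCue("request attention",
                              CueChannel.GESTURE | CueChannel.LIGHT | CueChannel.SOUND))
```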