Non-Facial and Non-Verbal Affective Expression for Appearance-Constrained Robots Used in Victim Management

Cited by: 0
Authors
Bethel C.L. [1]
Murphy R.R. [2]
Affiliations
[1] Department of Computer Science, Yale University, New Haven, CT 06511
[2] Center for Robot-Assisted Search and Rescue, Texas A&M University, College Station, TX 77843
Source
Paladyn | 2010, Vol. 1, Issue 4
Funding
U.S. National Science Foundation
Keywords
Affective Expression; Appearance-Constrained Robots; Design Guidelines; Human-Robot Interaction; Non-Verbal Communication;
DOI
10.2478/s13230-011-0009-5
Abstract
Non-facial and non-verbal methods of affective expression are essential for social interaction in appearance-constrained robots such as those used in search and rescue, law enforcement, and military applications. This research identified five main methods of non-facial and non-verbal affective expression: body movements, postures, orientation, color, and sound. Based on an extensive review of the literature, prescriptive design recommendations were developed for the appropriate non-facial and non-verbal affective expression methods in three proximity zones of interest (intimate, personal, and social). These design recommendations serve as guidelines for retroactively adding affective expression through software with minimal or no physical modification to a robot. A large-scale, complex human-robot interaction study was conducted to validate these design recommendations using 128 participants and four methods of evaluation. The study was conducted in a high-fidelity, confined-space simulated disaster site, with all robot interactions performed in the dark. Statistically significant results indicated that participants felt the robots that exhibited affective expressions were more calming, friendly, and attentive, which improved the social human-robot interactions. © Cindy L. Bethel et al. 2010.
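To make the abstract's core idea concrete, the following is a minimal illustrative sketch of how software might select affective-expression channels by proximity zone. The zone thresholds follow Hall's standard proxemics distances; the particular mapping of zones to expression methods is an assumed example for illustration, not the paper's actual design recommendations.

```python
# Illustrative sketch only. Zone boundaries follow Hall's proxemics
# (intimate < 0.46 m, personal < 1.2 m, social beyond); the zone-to-
# method mapping below is hypothetical, not the authors' guidelines.

INTIMATE, PERSONAL, SOCIAL = "intimate", "personal", "social"

def proximity_zone(distance_m: float) -> str:
    """Classify the human-robot distance into a proximity zone."""
    if distance_m < 0.46:
        return INTIMATE
    if distance_m < 1.2:
        return PERSONAL
    return SOCIAL

# Hypothetical emphasis: which of the five non-facial/non-verbal
# channels a controller might favor in each zone.
EXPRESSION_METHODS = {
    INTIMATE: ["body movement", "posture", "sound"],
    PERSONAL: ["body movement", "posture", "orientation", "color"],
    SOCIAL:   ["orientation", "color", "sound"],
}

def select_expressions(distance_m: float) -> list[str]:
    """Pick affective-expression channels for the current distance."""
    return EXPRESSION_METHODS[proximity_zone(distance_m)]
```

Because the selection lives entirely in software, this kind of mapping could be added to an existing appearance-constrained platform without physical modification, which is the retrofit scenario the guidelines target.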
Pages: 219-230
Page count: 11