Accessible Human-Robot Interaction for Telepresence Robots: A Case Study

Cited by: 1
Authors
Katherine M. Tsui et al. [1]
[not specified] [2]
Affiliations
[1] University of Massachusetts Lowell; Yale University, United States
Source
Paladyn, Journal of Behavioral Robotics | Volume 1
Keywords
accessible user interface; assistive robotics; augmented reality; computer-mediated communication; embodied video conferencing; remote presence; social telepresence robot; teleoperation
DOI
10.1515/pjbr-2015-0001
Abstract
The quality of life of people with special needs, such as residents of healthcare facilities, may be improved by operating social telepresence robots that let them participate in remote activities with friends or family. However, to date, no such platforms exist for this population. Methodology: Our research used an iterative, bottom-up, user-centered approach, drawing on our experience in assistive robotics. Based on the findings of our formative user studies, we developed an augmented reality user interface for our social telepresence robot. The interface focuses primarily on human-human interaction and communication through video, with support for semi-autonomous navigation. We conducted a case study (n=4) with our target population in which the robot was used to visit a remote art gallery. Results: All of the participants were able to operate the robot to explore the gallery, form opinions about the exhibits, and engage in conversation. Significance: This case study demonstrates that people from our target population can successfully take on the active role of operating a telepresence robot. © 2015 Katherine M. Tsui et al.
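As a concrete illustration only (this is not code from the paper): the sketch below shows one plausible way a semi-autonomous, click-to-drive telepresence interface of the kind the abstract describes might map an operator's click on the robot's video feed to a short, supervised move. Every name here (DriveCommand, click_to_command) and every parameter (a 60-degree horizontal field of view, a 1-meter step) is a hypothetical assumption for the sketch.

# Hypothetical sketch: converting a click on the video feed into a short
# supervised drive command. Names and parameters are illustrative only.

from dataclasses import dataclass

@dataclass
class DriveCommand:
    heading_deg: float   # relative turn; negative means turn left
    distance_m: float    # forward travel before stopping for new input

def click_to_command(click_x: int, frame_width: int,
                     horizontal_fov_deg: float = 60.0,
                     step_distance_m: float = 1.0) -> DriveCommand:
    """Map a horizontal click position to a short, supervised move.

    The click's offset from the image center is scaled into a relative
    heading within the camera's field of view; the robot then advances a
    fixed short step, so the operator stays in control between commands.
    """
    # Offset in [-0.5, 0.5]: 0 means the user clicked dead center.
    offset = click_x / frame_width - 0.5
    heading = offset * horizontal_fov_deg
    return DriveCommand(heading_deg=heading, distance_m=step_distance_m)

if __name__ == "__main__":
    # A click near the right edge of a 640-pixel-wide frame:
    cmd = click_to_command(click_x=600, frame_width=640)
    print(f"turn {cmd.heading_deg:+.1f} deg, drive {cmd.distance_m:.1f} m")

Short fixed steps like this are one common way to keep a remote operator in the loop: the robot never travels far on a single command, which suits the paper's emphasis on supervised, semi-autonomous navigation for operators with special needs.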