A comprehensive review of task understanding of command-triggered execution of tasks for service robots

Cited by: 0
Authors
Xiangming Xi
Shiqiang Zhu
Affiliations
[1] Zhejiang Lab, Research Center for Intelligent Robotics
[2] Zhejiang Lab
Source
Artificial Intelligence Review | 2023 / Volume 56
Keywords
Task understanding; Autonomous robots; Service robots; Command-triggered execution of tasks; Grounding; Human-robot interaction;
DOI
Not available
Abstract
Robotics is a cross-disciplinary branch of science and technology that builds on mechanics, control, computer science, artificial intelligence, and related fields. With advances in both software and hardware, especially in artificial intelligence, robots have been widely deployed across society and are becoming increasingly interactive in daily life, for example as service robots in museums, shopping malls, and restaurants. Although the ultimate goal of making a service robot behave like a human remains difficult to achieve, significant progress has been made over the past decades. Since service robots are commonly triggered to execute tasks specified by human users via commands (Comm-TET), and since correctly processing and understanding these commands is essential, we comprehensively review the development of the task understanding (TU) sub-process of Comm-TET for service robots. To organize the related literature in a coherent manner, we abstract the pipeline of Comm-TET and a generic framework of TU from existing research. Following this framework, we present in-depth discussions of each of its building blocks over the past decades and offer insights into future research directions. Compared with other reviews on TU, this review places more emphasis on technical developments and organizes the existing research as an integrated whole.
Pages: 7137-7193
Page count: 56