Intention estimation from gaze and motion features for human-robot shared-control object manipulation

Cited by: 9
Authors
Belardinelli, Anna [1 ]
Kondapally, Anirudh Reddy [2 ]
Ruiken, Dirk [1 ]
Tanneberg, Daniel [1 ]
Watabe, Tomoki [2 ]
Affiliations
[1] Honda Res Inst EU, Offenbach, Germany
[2] Honda Res & Dev Co Ltd, Innovat Res Excellence, Wako, Saitama, Japan
Source
2022 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS) | 2022
Keywords
EYE-HAND COORDINATION; AUTONOMY CONTROL; RECOGNITION; SYSTEMS;
DOI
10.1109/IROS47612.2022.9982249
Chinese Library Classification: TP [Automation technology; computer technology]
Discipline classification code: 0812
Abstract
Shared control can help in teleoperated object manipulation by assisting with the execution of the user's intention. To this end, robust and prompt intention estimation is needed, which relies on behavioral observations. Here, an intention estimation framework is presented that uses natural gaze and motion features to predict the current action and the target object. The system is trained and tested in a simulated environment on pick-and-place sequences performed with both hands in a relatively cluttered scene, with possible hand-overs between hands. Validation across different users and hands achieves good accuracy and earliness of prediction. An analysis of the predictive power of single features shows the predominance of the grasping trigger and the gaze features in the early identification of the current action. In the current framework, the same probabilistic model can be used for the two hands working in parallel and independently, while a rule-based model is proposed to identify the resulting bimanual action. Finally, limitations of this approach and perspectives toward more complex, fully bimanual manipulations are discussed.
Pages: 9806-9813
Page count: 8
Cited References
10 of 32 shown
[1]   Motion intention recognition in robot assisted applications [J].
Aarno, Daniel ;
Kragic, Danica .
ROBOTICS AND AUTONOMOUS SYSTEMS, 2008, 56 (08) :692-705
[2]   Inferring Goals with Gaze during Teleoperated Manipulation [J].
Aronson, Reuben M. ;
Almutlak, Nadia ;
Admoni, Henny .
2021 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2021, :7307-7314
[3]  
Artigas J, 2016, IEEE INT CONF ROBOT, P1166, DOI 10.1109/ICRA.2016.7487246
[4]   It's in the eyes: Planning precise manual actions before execution [J].
Belardinelli, Anna ;
Stepper, Madeleine Y. ;
Butz, Martin V. .
JOURNAL OF VISION, 2016, 16 (01) :1-18
[5]   Emerging Cellular Therapies for Glioblastoma Multiforme [J].
Choi, Paul J. ;
Tubbs, R. Shane ;
Oskouian, Rod J. .
CUREUS, 2018, 10 (03)
[6]   A policy-blending formalism for shared control [J].
Dragan, Anca D. ;
Srinivasa, Siddhartha S. .
INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2013, 32 (07) :790-805
[7]   Exploiting Three-Dimensional Gaze Tracking for Action Recognition During Bimanual Manipulation to Enhance Human-Robot Collaboration [J].
Fathaliyan, Alireza Haji ;
Wang, Xiaoyu ;
Santos, Veronica J. .
FRONTIERS IN ROBOTICS AND AI, 2018, 5
[8]
Fuchs S., 2021, FRONTIERS IN NEUROROBOTICS, 15: 33
[9]   Feedback strategies for telemanipulation with shared control of object handling forces [J].
Griffin, WB ;
Provancher, WR ;
Cutkosky, MR .
PRESENCE-TELEOPERATORS AND VIRTUAL ENVIRONMENTS, 2005, 14 (06) :720-731
[10]   Recognition, prediction, and planning for assisted teleoperation of freeform tasks [J].
Hauser, Kris .
AUTONOMOUS ROBOTS, 2013, 35 (04) :241-254