Online Identification of Interaction Behaviors From Haptic Data During Collaborative Object Transfer

Cited by: 6
Authors
Kucukyilmaz, Ayse [1]
Issak, Illimar [1]
Affiliations
[1] Univ Lincoln, Sch Comp Sci, Lincoln LN6 7TS, England
Source
IEEE ROBOTICS AND AUTOMATION LETTERS | 2020, Vol. 5, No. 1
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
Classification; Feature Extraction; Force and Tactile Sensing; Haptics and Haptic Interfaces; Human Factors and Human-in-the-Loop; Learning and Adaptive Systems; Physical Human-Human Interaction; Physical Human-Robot Interaction; Recognition
DOI
10.1109/LRA.2019.2945261
CLC number
TP24 [Robotics]
Discipline codes
080202; 1405
Abstract
Joint object transfer is a complex task that is less structured and less well-defined than most industrial manipulation settings. When two humans perform such a task, they cooperate through different modalities to understand the interaction states during operation and mutually adapt to one another's actions. Mutual adaptation implies that both partners can identify how well they collaborate (i.e., infer the interaction state) and act accordingly. These interaction states capture whether the partners work in harmony, face conflicts, or remain passive during the interaction. Understanding how two humans work together during physical interaction is important when exploring how a robotic assistant should operate in similar settings. This study is a first step toward an automatic classification mechanism that identifies the interaction state during ongoing object co-manipulation. The classification is performed on a dataset collected from 40 subjects, paired to form 20 dyads. The dyads take part in a physical human-human interaction (pHHI) scenario, moving an object in a haptics-enabled virtual environment to reach predefined goal configurations. We propose a sliding-window approach for feature extraction and demonstrate an online classification methodology to identify interaction patterns. We evaluate our approach using 1) a support vector machine classifier (SVMc) and 2) a Gaussian process classifier (GPc) for multi-class classification, and achieve over 80% accuracy with both classifiers when identifying general interaction types.
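The sliding-window feature extraction mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window length, step size, and the choice of per-window statistics (mean, standard deviation, min, max over a force signal) are assumptions for demonstration only.

```python
import numpy as np

def sliding_window_features(signal, window=50, step=10):
    """Extract simple per-window statistics from a 1-D haptic signal
    (e.g., interaction force along one axis).
    Note: window/step sizes and feature choices are illustrative
    assumptions, not the parameters used in the paper."""
    feats = []
    for start in range(0, len(signal) - window + 1, step):
        w = signal[start:start + window]
        feats.append([w.mean(), w.std(), w.min(), w.max()])
    return np.array(feats)

# Toy example: 200 samples of a synthetic force trace.
t = np.linspace(0, 2 * np.pi, 200)
force = np.sin(t)
X = sliding_window_features(force)
print(X.shape)  # (16, 4): 16 windows, 4 features per window
```

Each row of `X` would then be fed to a multi-class classifier (such as an SVM or Gaussian process classifier) to label the interaction state for that window, which is what makes the approach usable online during an ongoing collaboration.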
Pages: 96-102
Page count: 7