Towards Skill Transfer via Learning-Based Guidance in Human-Robot Interaction: An Application to Orthopaedic Surgical Drilling Skill

Cited by: 0
Authors
Ehsan Zahedi
Fariba Khosravian
Weiqi Wang
Mehran Armand
Javad Dargahi
Mehrdad Zadeh
Affiliations
[1] Concordia University, Department of Mechanical, Industrial and Aerospace Engineering
[2] Johns Hopkins University
Source
Journal of Intelligent & Robotic Systems | 2020, Vol. 98
Keywords
Human-robot interaction; Machine learning-based guidance; Virtual surgical simulation; 68T40; 93C85;
DOI: not available
Abstract
This paper presents a machine learning-based guidance (LbG) approach for kinesthetic human-robot interaction (HRI) that can be used in virtual training simulations. Demonstrated positional and force skills are learned both to discriminate the skill levels of users and to produce LbG forces. Force information is obtained from virtual forces, derived from real computed tomography (CT) data, rather than from force sensors. A femur bone drilling simulation is developed to provide a practice environment for orthopaedic residents. The residents receive haptic feedback that enables them to feel the variable stiffness of bone layers, and X-ray views of the bone are presented to them for better tracking of a pre-defined path inside the bone. The simulation is capable of planning a drill path, generating X-rays for user-defined orientations, and recording motion data for user assessment and skill modeling. The knowledge of expert surgeons is also incorporated into the simulation to provide LbG forces that correct the unpredictable motions of the residents. To discriminate the skill levels of users, machine learning tools are used to develop surgical expert and resident models. In addition, to improve residents' performance, the expert hidden conditional random field (HCRF) model is used to generate adaptive LbG forces according to the similarity between a resident's motions and the expert model. Experimental results show that the learning-based approach is able to assess the skill of users and improve residents' performance.
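As a concrete illustration of the haptic rendering the abstract describes, the following is a minimal sketch, assuming a simple penalty-based contact model in which stiffness varies linearly with local CT intensity. The intensity range, stiffness range, and all function names are illustrative assumptions, not the paper's implementation.

import numpy as np

# Assumed ranges; the paper does not specify its stiffness mapping.
K_MIN, K_MAX = 200.0, 3000.0    # contact stiffness bounds (N/m)
HU_MIN, HU_MAX = 0.0, 2000.0    # CT intensity range treated as bone

def stiffness_from_ct(intensity):
    """Linearly map a CT voxel intensity to a contact stiffness,
    so denser cortical bone feels stiffer than cancellous bone."""
    t = np.clip((intensity - HU_MIN) / (HU_MAX - HU_MIN), 0.0, 1.0)
    return K_MIN + t * (K_MAX - K_MIN)

def drilling_force(tip_pos, surface_pos, intensity):
    """Penalty force F = k * (surface - tip): pushes the drill tip
    back toward the bone surface with CT-dependent stiffness."""
    penetration = np.asarray(surface_pos) - np.asarray(tip_pos)
    if np.linalg.norm(penetration) == 0.0:
        return np.zeros(3)
    return stiffness_from_ct(intensity) * penetration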
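The skill assessment and adaptive guidance can be sketched in the same spirit. The paper trains HCRF models; since no widely used HCRF library exists, the stand-in below uses hmmlearn's GaussianHMM as an assumed substitute: one sequence model per skill level classifies a trial by log-likelihood, and the per-sample expert log-likelihood modulates a guidance gain. The gain law and all names are illustrative, not the authors' formulation.

import numpy as np
from hmmlearn import hmm  # assumed stand-in for the paper's HCRF

def fit_skill_model(trials):
    """Fit one sequence model per skill level; each trial is a
    (T_i x D) array of position/force samples."""
    X = np.vstack(trials)
    lengths = [len(t) for t in trials]
    model = hmm.GaussianHMM(n_components=4, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    return model

def classify_trial(trial, expert_model, resident_model):
    """Label a trial by whichever model assigns the higher log-likelihood."""
    if expert_model.score(trial) > resident_model.score(trial):
        return "expert"
    return "resident"

def guidance_force(pos, path_point, trial_so_far, expert_model,
                   k_max=50.0, ll_ref=-5.0):
    """Pull toward the planned drill path, more strongly when the motion
    so far looks less expert-like under the expert model (assumed gain law)."""
    ll = expert_model.score(trial_so_far) / len(trial_so_far)
    similarity = np.clip(np.exp(ll - ll_ref), 0.0, 1.0)  # 1 = expert-like
    return k_max * (1.0 - similarity) * (np.asarray(path_point) - np.asarray(pos))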
Pages: 667–678
Number of pages: 11