Vision-Based Suture Tensile Force Estimation in Robotic Surgery

Cited by: 19
Authors
Jung, Won-Jo [1 ]
Kwak, Kyung-Soo [1 ]
Lim, Soo-Chul [1 ]
Affiliations
[1] Dongguk Univ, Dept Mech Robot & Energy Engn, 30 Pildong Ro 1gil, Seoul 04620, South Korea
Funding
National Research Foundation of Singapore;
Keywords
force estimation; interaction force; neural networks; machine learning; minimally invasive surgery; suture tensile force; FEEDBACK; DEFORMATION;
DOI
10.3390/s21010110
Chinese Library Classification
O65 [Analytical Chemistry];
Discipline Codes
070302; 081704;
Abstract
Compared to laparoscopy, robotics-assisted minimally invasive surgery suffers from the absence of force feedback, which is important for preventing suture breakage. To overcome this problem, surgeons infer the suture force from their proprioception and the 2D image by comparing them with their training experience. Based on this idea, a deep-learning-based method that uses a single image and the robot position to estimate the tensile force of the sutures without a force sensor is proposed. A neural network structure combining a modified Inception-ResNet-V2 and Long Short-Term Memory (LSTM) networks is used to estimate the suture pulling force. The feasibility of the proposed network is verified on a generated database recording the interaction under two different artificial skins and two different situations (in vivo and in vitro), with images captured at 13 viewing angles by changing the tool positions of a master-slave robotic system. In the evaluation conducted to show the feasibility of interaction force estimation, the proposed learning models successfully estimated the tensile force at 10 viewing angles unseen during training.
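The abstract describes a vision encoder (a modified Inception-ResNet-V2) whose per-frame features are fused with the robot tool position and fed to an LSTM that regresses the suture tensile force. The following is a minimal PyTorch sketch of that kind of architecture, not the paper's actual model: the small convolutional encoder, the feature/hidden dimensions, and all class and argument names are illustrative stand-ins.

```python
import torch
import torch.nn as nn

class SutureForceNet(nn.Module):
    """Sketch of an image + robot-position force estimator.

    A small CNN (stand-in for the paper's modified Inception-ResNet-V2)
    encodes each frame; the features are concatenated with the robot
    position and passed through an LSTM that outputs a per-step force.
    """
    def __init__(self, feat_dim=64, pos_dim=3, hidden=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )
        self.lstm = nn.LSTM(feat_dim + pos_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # scalar tensile force per time step

    def forward(self, images, positions):
        # images: (B, T, 3, H, W); positions: (B, T, pos_dim)
        b, t = images.shape[:2]
        feats = self.encoder(images.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(torch.cat([feats, positions], dim=-1))
        return self.head(out).squeeze(-1)  # (B, T) force estimates

model = SutureForceNet()
force = model(torch.randn(2, 5, 3, 64, 64), torch.randn(2, 5, 3))
print(force.shape)  # torch.Size([2, 5])
```

Fusing the position signal before the recurrent layer lets the LSTM model how force evolves with tool motion across frames, which is the role the robot position plays in the method described above.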
Pages
1-13 (13 pages)