Interactive multi-modal suturing

Cited by: 13
Authors
Payandeh, Shahram [1 ]
Shi, Fuhan [1 ]
Affiliations
[1] Simon Fraser Univ, Expt Robot & Graph Lab, Burnaby, BC V5A 1S6, Canada
Keywords
Virtual suturing; Suture model; Wound closure; Tissue tearing; Haptic feedback; Surgical training environment; Serious games; SURGERY SIMULATION;
DOI
10.1007/s10055-010-0174-6
Chinese Library Classification (CLC)
TP39 [Computer applications]
Subject classification codes
081203; 0835
Abstract
We present a mechanics-based interactive multi-modal environment designed as part of a serious gaming platform. The specific objectives are to teach basic suturing and knotting techniques for simple skin or soft tissue wound closure. The pre-wound suturing target, skin, or deformable tissue is modeled as a modified mass-spring system. The suturing material is designed as a mechanics-based deformable linear object. Tools involved in a typical suturing procedure are also simulated. Collision management modules between the soft tissue and the needle, and between the soft tissue and the suture, are analyzed. In addition to modeling the interactive environment of a typical suturing procedure, the basics of the modeling approaches used to evaluate a stitch formed by the user are also discussed. For example, if the needle insertion points are too close to each other or to the edge of the wound, pulling the suture will tear the soft tissue instead of drawing the incision together. Experimental results show that our simulator can run on a standard personal computer and allows users to perform different suturing patterns with smooth graphics and haptic feedback.
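For illustration only (not taken from the paper): a minimal Python sketch of the two ingredients the abstract names, an explicit-Euler update for one node of a mass-spring tissue mesh and a geometric check on stitch placement relative to the wound edge. All function names, parameters, and thresholds (k, c, min_bite, min_spacing) are assumptions, not values reported by the authors.

# Hypothetical sketch of a mass-spring tissue node update and a stitch-placement
# check of the kind described in the abstract; parameters are illustrative only.
import math

def step_mass_spring(pos, vel, rest, neighbors, k=50.0, c=0.5, mass=0.01, dt=1e-3):
    """Advance one tissue node of a mass-spring mesh by explicit Euler.

    pos, vel  : (x, y, z) tuples for this node
    rest      : dict neighbor_index -> rest length of the connecting spring
    neighbors : dict neighbor_index -> current (x, y, z) position
    k, c, mass, dt : assumed stiffness, damping, node mass, and time step
    """
    force = [0.0, 0.0, 0.0]
    for j, pj in neighbors.items():
        d = [pj[i] - pos[i] for i in range(3)]
        length = math.sqrt(sum(x * x for x in d)) or 1e-9
        mag = k * (length - rest[j])          # Hooke spring magnitude
        for i in range(3):
            force[i] += mag * d[i] / length   # directed along the spring
    for i in range(3):
        force[i] -= c * vel[i]                # simple velocity damping
    new_vel = tuple(vel[i] + dt * force[i] / mass for i in range(3))
    new_pos = tuple(pos[i] + dt * new_vel[i] for i in range(3))
    return new_pos, new_vel

def stitch_ok(entry, exit_, wound_edge, min_bite=0.004, min_spacing=0.003):
    """Flag a stitch likely to tear the tissue when the suture is pulled.

    entry, exit_ : needle insertion and exit points (metres)
    wound_edge   : nearest point on the incision line to the stitch
    min_bite, min_spacing : placeholder thresholds, not values from the paper
    """
    if math.dist(entry, exit_) < min_spacing:
        return False   # insertion points too close together
    if min(math.dist(entry, wound_edge), math.dist(exit_, wound_edge)) < min_bite:
        return False   # bite too shallow: the suture would tear the wound edge
    return True

In a simulator of this kind, the node update would run over every mesh vertex each frame, while the placement check would be evaluated once per completed needle pass to decide whether to trigger the tissue-tearing response instead of closing the wound.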
Pages: 241-253
Page count: 13