Needle Segmentation Using GAN: Restoring Thin Instrument Visibility in Robotic Ultrasound

Cited by: 0
Authors
Jiang, Zhongliang [1 ]
Li, Xuesong [1 ]
Chu, Xiangyu [2 ,3 ]
Karlas, Angelos [4 ,5 ]
Bi, Yuan [1 ]
Cheng, Yingsheng [6 ]
Samuel Au, K. W. [2 ,3 ]
Navab, Nassir [1 ]
Affiliations
[1] Tech Univ Munich, Chair Comp Aided Med Procedures & Augmented Real, D-85748 Garching, Germany
[2] Chinese Univ Hong Kong, Dept Mech & Automat Engn, Hong Kong, Peoples R China
[3] Multiscale Med Robot Ctr, Hong Kong, Peoples R China
[4] Tech Univ Munich, Klinikum rechts Isar, Dept Vasc & Endovasc Surg, D-80333 Munich, Germany
[5] Tech Univ Munich, German Ctr Cardiovasc Res DZHK, D-80333 Munich, Germany
[6] Tongji Univ, Shanghai Pulm Hosp, Dept Med Imaging, Sch Med, Shanghai 200433, Peoples R China
Keywords
Needles; Image segmentation; Three-dimensional displays; Generators; Training; Probes; Feature extraction; Ultrasonic imaging; Biomedical imaging; Real-time systems; Medical robotics; needle segmentation; robotic ultrasound (US); ultrasound segmentation; ARTIFACTS;
DOI
10.1109/TIM.2024.3451569
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
Ultrasound-guided percutaneous needle insertion is a standard procedure for both biopsy and ablation in clinical practice. However, owing to the complex interaction between tissue and instrument, the needle may deviate from the in-plane view, preventing close monitoring of the percutaneous needle. To address this challenge, we introduce a robot-assisted ultrasound (US) imaging system designed to seamlessly monitor the insertion process and autonomously restore the visibility of the inserted instrument when misalignment occurs. To this end, an adversarial structure is presented to encourage the generation of segmentation masks that align consistently with the ground truth in high-order space. This study also systematically investigates the effect of various training loss functions and their combinations on segmentation performance. When misalignment between the probe and the percutaneous needle is detected, the robot is triggered to perform a transverse search that optimizes the positional and rotational adjustments needed to restore needle visibility. Experimental results on ex vivo porcine samples demonstrate that the proposed method can precisely segment the percutaneous needle (tip error of 0.37 +/- 0.29 mm and angle error of 1.19 +/- 0.29 degrees). Furthermore, the needle appearance was successfully restored under the repositioned probe pose in all 45 trials, with repositioning errors of 1.51 +/- 0.95 mm and 1.25 +/- 0.79 degrees.
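To illustrate the adversarial training idea described in the abstract (a segmentation network whose masks are pushed toward ground-truth consistency in high-order space by a discriminator, combined with pixel-wise losses), the following is a minimal PyTorch sketch. The discriminator architecture, the Dice+BCE pixel-wise combination, the adversarial weight adv_weight, and the function names (SegNet, train_step) are illustrative assumptions, not the paper's exact implementation.

```python
# Hedged sketch: adversarial needle-mask segmentation (assumed setup, not the
# authors' exact configuration). seg_net is any network mapping a B x 1 x H x W
# US image to B x 1 x H x W mask logits (e.g., a U-Net).
import torch
import torch.nn as nn


class Discriminator(nn.Module):
    """Judges whether a (US image, mask) pair uses a ground-truth or predicted mask."""
    def __init__(self, in_ch=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),
            nn.AdaptiveAvgPool2d(1),
        )

    def forward(self, image, mask):
        return self.net(torch.cat([image, mask], dim=1)).flatten(1)


def dice_loss(pred, target, eps=1e-6):
    inter = (pred * target).sum(dim=(1, 2, 3))
    union = pred.sum(dim=(1, 2, 3)) + target.sum(dim=(1, 2, 3))
    return 1.0 - ((2.0 * inter + eps) / (union + eps)).mean()


def train_step(seg_net, disc, opt_seg, opt_disc, image, gt_mask, adv_weight=0.1):
    bce = nn.BCEWithLogitsLoss()

    # --- discriminator update: real (image, GT mask) -> 1, predicted mask -> 0 ---
    with torch.no_grad():
        pred_mask = torch.sigmoid(seg_net(image))
    real_logit = disc(image, gt_mask)
    fake_logit = disc(image, pred_mask)
    loss_d = bce(real_logit, torch.ones_like(real_logit)) + \
             bce(fake_logit, torch.zeros_like(fake_logit))
    opt_disc.zero_grad(); loss_d.backward(); opt_disc.step()

    # --- segmentation update: pixel-wise (BCE + Dice) loss plus adversarial term ---
    logits = seg_net(image)
    pred_mask = torch.sigmoid(logits)
    loss_pix = bce(logits, gt_mask) + dice_loss(pred_mask, gt_mask)
    fake_logit = disc(image, pred_mask)
    loss_adv = bce(fake_logit, torch.ones_like(fake_logit))  # try to fool the discriminator
    loss_g = loss_pix + adv_weight * loss_adv
    opt_seg.zero_grad(); loss_g.backward(); opt_seg.step()
    return loss_g.item(), loss_d.item()
```

Because the discriminator sees the whole image-mask pair, its feedback penalizes globally implausible needle shapes (broken or scattered segments) that a purely pixel-wise loss tolerates; this is one common reading of the "high-order consistency" the abstract refers to, and the relative weighting of the pixel-wise and adversarial terms is exactly the kind of loss-combination choice the study reports investigating.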
Pages: 11