Learning-based needle tip tracking in 2D ultrasound by fusing visual tracking and motion prediction

Cited by: 17
Authors
Yan, Wanquan [1 ,2 ]
Ding, Qingpeng [1 ,2 ]
Chen, Jianghua [1 ,2 ]
Yan, Kim [1 ,2 ]
Tang, Raymond Shing-Yan [3 ,4 ]
Cheng, Shing Shin [1 ,2 ,5 ,6 ]
Affiliations
[1] Chinese Univ Hong Kong, Dept Mech & Automat Engn, Hong Kong, Peoples R China
[2] Chinese Univ Hong Kong, T Stone Robot Inst, Hong Kong, Peoples R China
[3] Chinese Univ Hong Kong, Dept Med & Therapeut, Hong Kong, Peoples R China
[4] Chinese Univ Hong Kong, Inst Digest Dis, Hong Kong, Peoples R China
[5] Chinese Univ Hong Kong, Inst Med Intelligence & XR, Multiscale Med Robot Ctr, Hong Kong, Peoples R China
[6] Chinese Univ Hong Kong, Shun Hing Inst Adv Engn, Hong Kong, Peoples R China
Keywords
Ultrasound imaging; Needle tracking; Motion prediction; Data fusion; Deep learning; REAL-TIME; INVISIBLE NEEDLE; BIOPSY; SEGMENTATION; LOCALIZATION; ALGORITHM; 2-D;
DOI
10.1016/j.media.2023.102847
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Visual trackers are the most commonly adopted approach for needle tip tracking in ultrasound (US)-based procedures. However, they often perform unsatisfactorily in biological tissues due to significant background noise and anatomical occlusion. This paper presents a learning-based needle tip tracking system that consists of not only a visual tracking module but also a motion prediction module. In the visual tracking module, two sets of masks are designed to improve the tracker's discriminability, and a template update submodule keeps up to date with the needle tip's current appearance. In the motion prediction module, a Transformer network-based prediction architecture estimates the target's current position from its historical position data, addressing the problem of the target's temporary disappearance. A data fusion module then integrates the results from the visual tracking and motion prediction modules to provide robust and accurate tracking results. Our proposed tracking system showed distinct improvement over other state-of-the-art trackers in motorized needle insertion experiments in both gelatin phantom and biological tissue environments (e.g., a 78% tracking success rate versus <60% in the most challenging "In-plane-static" scenario of the tissue experiments). Its robustness was also verified in manual needle insertion experiments under varying needle velocities and directions and occasional temporary needle tip disappearance, where its tracking success rate was >18% higher than that of the second-best tracking system. The proposed tracking system, with its computational efficiency, tracking robustness, and tracking accuracy, will lead to safer targeting in existing clinical practice of US-guided needle operations and could potentially be integrated into a tissue biopsy robotic system.
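The fusion strategy described in the abstract (fall back to motion prediction when the visual tracker loses the tip, otherwise combine the two estimates) can be sketched as follows. This is a minimal illustrative sketch only: the function names, the constant-velocity predictor, and the confidence-weighted blend are assumptions standing in for the paper's actual Transformer-based predictor and data fusion module.

```python
def predict_next(history):
    """Constant-velocity extrapolation from the two most recent tip positions.
    (A simple stand-in for the paper's Transformer-based motion predictor,
    which uses a longer position history.)"""
    (x0, y0), (x1, y1) = history[-2], history[-1]
    return (2 * x1 - x0, 2 * y1 - y0)

def fuse_tip_estimates(visual_pos, visual_conf, history, conf_threshold=0.5):
    """Fuse the visual tracker's estimate with the motion prediction.

    When the tracker's confidence is low (e.g., the tip temporarily
    disappears behind anatomy), fall back to the motion prediction alone;
    otherwise blend the two positions with confidence weighting."""
    predicted = predict_next(history)
    if visual_conf < conf_threshold:
        return predicted  # target occluded or invisible: trust the predictor
    w = visual_conf
    return tuple(w * v + (1.0 - w) * p for v, p in zip(visual_pos, predicted))
```

For example, with a history of [(0, 0), (1, 1)] the predictor extrapolates (2, 2); a confident visual estimate near that point is blended toward the prediction, while a low-confidence estimate is discarded in favor of it.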
Pages: 14