Augmented Reality-Based Lung Ultrasound Scanning Guidance

Cited by: 3
Authors
Bimbraw, Keshav [1 ]
Ma, Xihan [1 ]
Zhang, Ziming [1 ]
Zhang, Haichong [1 ]
Affiliations
[1] Worcester Polytech Inst, Worcester, MA 01609 USA
Source
MEDICAL ULTRASOUND, AND PRETERM, PERINATAL AND PAEDIATRIC IMAGE ANALYSIS, ASMUS 2020, PIPPI 2020 | 2020 / Vol. 12437
Keywords
Lung ultrasound; POCUS; COVID-19; Coronavirus; Augmented reality; Computer vision; Machine learning; Image processing;
DOI
10.1007/978-3-030-60334-2_11
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Lung ultrasound (LUS) is an established non-invasive imaging method for diagnosing respiratory illnesses. With the rise of SARS-CoV-2 (COVID-19) as a global pandemic, LUS has been used to detect pneumopathy for triaging and monitoring patients with diagnosed or suspected COVID-19 infection. While LUS is cost-effective, radiation-free, and more portable than chest X-ray and CT, its accessibility is limited by its user dependency and by the small number of physicians and sonographers who can perform appropriate scanning and diagnosis. In this paper, we propose a framework for guiding LUS scanning with augmented reality, in which the procedure is guided by projecting the scanning trajectory onto the patient's body. To develop such a system, we implement a computer vision-based detection algorithm to classify different regions of the human body. The DensePose algorithm is used to obtain body mesh data for the upper body captured with a monocular camera. The torso submesh is used to extract and overlay the eight regions corresponding to the anterior and lateral chest for LUS guidance. To minimize the instability of the DensePose mesh coordinates across different frontal camera angles, a machine learning regression algorithm is applied to predict an angle-specific projection model for the chest. ArUco markers are used to train the ground-truth chest regions to be scanned, and a single additional ArUco marker is used to detect the center-line of the body. The augmented scanning regions are highlighted one by one to guide the scanning path during the LUS procedure. We demonstrate the feasibility of guiding the LUS scanning procedure through the combination of augmented reality, computer vision, and machine learning.
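The abstract describes a guidance loop built from three components: ArUco-based detection of the body center-line, a learned mapping from the camera's frontal angle to the eight chest-region polygons, and an AR overlay that highlights the regions in sequence. The sketch below is a minimal illustration of how such a loop could be wired together in Python with OpenCV and scikit-learn, not the authors' implementation: DensePose mesh extraction is omitted, and the function names (detect_centerline_marker, fit_region_regressor, guide_scan), the random-forest regressor, and the training-data layout are assumptions introduced here for illustration.

```python
"""Minimal sketch (assumed, not the paper's code) of AR-guided LUS scanning:
detect an ArUco marker on the body center-line, predict angle-specific chest
region polygons with a regression model, and highlight regions one by one."""
import cv2
import numpy as np
from sklearn.ensemble import RandomForestRegressor

N_REGIONS, CORNERS = 8, 4  # eight anterior/lateral chest regions, quadrilaterals


def detect_centerline_marker(frame_bgr):
    """Return the image-plane centroid of the single ArUco marker placed on the
    body center-line, or None if it is not visible."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    # OpenCV >= 4.7 ArUco API; older versions use cv2.aruco.detectMarkers(...)
    detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None
    return corners[0][0].mean(axis=0)  # (x, y) of the first detected marker


def fit_region_regressor(angles_deg, region_corners):
    """Fit a regressor mapping camera frontal angle -> eight region polygons.
    angles_deg: (n_samples,); region_corners: (n_samples, 8, 4, 2) in pixels,
    e.g. collected from ArUco-labelled ground-truth frames (assumed format)."""
    X = np.asarray(angles_deg).reshape(-1, 1)
    y = np.asarray(region_corners).reshape(len(X), -1)
    return RandomForestRegressor(n_estimators=100).fit(X, y)


def highlight_region(frame_bgr, polygons, active_idx, alpha=0.4):
    """Overlay all region outlines and fill the currently active region."""
    overlay = frame_bgr.copy()
    for i, poly in enumerate(polygons):
        pts = poly.astype(np.int32).reshape(-1, 1, 2)
        if i == active_idx:
            cv2.fillPoly(overlay, [pts], color=(0, 255, 0))
        cv2.polylines(overlay, [pts], isClosed=True, color=(255, 255, 0), thickness=2)
    return cv2.addWeighted(overlay, alpha, frame_bgr, 1 - alpha, 0)


def guide_scan(frame_bgr, regressor, frontal_angle_deg, active_idx):
    """One guidance step: predict the region polygons for the current camera
    angle, re-center them on the detected center-line marker, and render."""
    pred = regressor.predict([[frontal_angle_deg]]).reshape(N_REGIONS, CORNERS, 2)
    center = detect_centerline_marker(frame_bgr)
    if center is not None:
        pred = pred - pred.reshape(-1, 2).mean(axis=0) + center
    return highlight_region(frame_bgr, pred, active_idx)
```

In this sketch all 8 x 4 x 2 corner coordinates are predicted jointly from the frontal angle, so the regions remain mutually consistent at a given viewing angle; stepping active_idx from 0 to 7 reproduces the one-by-one highlighting of the scanning path described in the abstract.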
Pages: 106-115
Number of pages: 10