Multilevel effective surgical workflow recognition in robotic left lateral sectionectomy with deep learning: experimental research

Cited by: 0
Authors
Liu, Yanzhe [1 ]
Zhao, Shang [2 ,3 ]
Zhang, Gong [1 ]
Zhang, Xiuping [1 ]
Hu, Minggen [1 ]
Zhang, Xuan [1 ]
Li, Chenggang [1 ]
Zhou, S. Kevin [2 ,3 ]
Liu, Rong [1 ,4 ]
Affiliations
[1] Chinese Peoples Liberat Army Gen Hosp, Fac Hepatobiliary Pancreat Surg, Med Ctr 1, Med Sch Chinese Peoples Liberat Army PLA, Beijing, Peoples R China
[2] Univ Sci & Technol China, Ctr Med Imaging Robot Analyt Comp & Learning MIRAC, Sch Biomed Engn, Suzhou 215123, Peoples R China
[3] Univ Sci & Technol China, Suzhou Inst Adv Res, Ctr Med Imaging Robot Analyt Comp & Learning MIRAC, Suzhou 215123, Peoples R China
[4] Chinese Peoples Liberat Army PLA Gen Hosp, Fac Hepatobiliary Pancreat Surg, Beijing 100853, Peoples R China
Keywords
artificial intelligence; deep learning; robotic left lateral sectionectomy; robotic surgery; surgical workflow recognition; ARTIFICIAL-INTELLIGENCE; DATA SCIENCE;
DOI
10.1097/JS9.0000000000000559
Chinese Library Classification
R61 [Operative Surgery]
Subject Classification Code
Abstract
Background: Automated surgical workflow recognition is the foundation for computational models of medical knowledge to interpret surgical procedures. Fine-grained segmentation of the surgical process and improved accuracy of surgical workflow recognition facilitate the realization of autonomous robotic surgery. This study aimed to construct a multigranularity temporal annotation dataset of the standardized robotic left lateral sectionectomy (RLLS) and to develop a deep learning-based automated model for multilevel overall and effective surgical workflow recognition.

Methods: From December 2016 to May 2019, 45 RLLS videos were enrolled in our dataset. All frames of the RLLS videos in this study were labeled with temporal annotations. The authors defined frames showing activities that truly contribute to the surgery as effective frames, while the remaining frames were labeled as under-effective. Effective frames of all RLLS videos were annotated at three hierarchical levels comprising 4 steps, 12 tasks, and 26 activities. A hybrid deep learning model was used for surgical workflow recognition of steps, tasks, activities, and under-effective frames. Moreover, the authors also carried out multilevel effective surgical workflow recognition after removing under-effective frames.

Results: The dataset comprises 4 383 516 annotated RLLS video frames with multilevel annotation, of which 2 418 468 frames are effective. The overall accuracies of automated recognition for steps, tasks, activities, and under-effective frames were 0.82, 0.80, 0.79, and 0.85, respectively, with corresponding precision values of 0.81, 0.76, 0.60, and 0.85. In multilevel effective surgical workflow recognition, the overall accuracies increased to 0.96, 0.88, and 0.82 for steps, tasks, and activities, respectively, while the precision values increased to 0.95, 0.80, and 0.68.

Conclusion: In this study, the authors created a dataset of 45 RLLS cases with multilevel annotations and developed a hybrid deep learning model for surgical workflow recognition. The authors demonstrated markedly higher accuracy in multilevel effective surgical workflow recognition when under-effective frames were removed. Our research could be helpful in the development of autonomous robotic surgery.
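The abstract describes a frame-level, multilevel recognition pipeline: separate predictions at the Step, Task, and Activity levels, plus detection of under-effective frames, which are then filtered out before multilevel "effective" recognition. The sketch below is a minimal PyTorch illustration of that idea, not the authors' hybrid model: the backbone choice (a ResNet-50 without the paper's temporal component), the head sizes, and the keep_effective_frames helper are assumptions; only the class counts (4 steps, 12 tasks, 26 activities) come from the abstract.

```python
# Minimal sketch (assumptions noted above), not the authors' implementation.
import torch
import torch.nn as nn
import torchvision.models as models


class MultilevelWorkflowNet(nn.Module):
    def __init__(self, n_steps=4, n_tasks=12, n_activities=26):
        super().__init__()
        # Spatial backbone only; the paper's hybrid model also models
        # temporal context, which is omitted here for brevity.
        backbone = models.resnet50(weights=None)
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()
        self.backbone = backbone
        # One classification head per annotation level,
        # plus a binary head flagging under-effective frames.
        self.step_head = nn.Linear(feat_dim, n_steps)
        self.task_head = nn.Linear(feat_dim, n_tasks)
        self.activity_head = nn.Linear(feat_dim, n_activities)
        self.effective_head = nn.Linear(feat_dim, 2)  # 0 = under-effective, 1 = effective

    def forward(self, frames):           # frames: (B, 3, H, W)
        feats = self.backbone(frames)    # (B, feat_dim)
        return {
            "step": self.step_head(feats),
            "task": self.task_head(feats),
            "activity": self.activity_head(feats),
            "effective": self.effective_head(feats),
        }


def keep_effective_frames(outputs, frames):
    """Drop frames predicted as under-effective, mirroring the paper's
    'effective workflow recognition' setting (class index 1 = effective
    is an assumed convention)."""
    is_effective = outputs["effective"].argmax(dim=1) == 1
    return frames[is_effective]


if __name__ == "__main__":
    model = MultilevelWorkflowNet()
    batch = torch.randn(2, 3, 224, 224)
    out = model(batch)
    print({k: v.shape for k, v in out.items()})
```

In this formulation, the reported gains from removing under-effective frames correspond to evaluating the Step/Task/Activity heads only on frames retained by a filter like keep_effective_frames.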
Pages: 2941-2952
Page count: 12