Crowdsourced Assessment of Ureteroscopy with Laser Lithotripsy Video Feed Does Not Correlate with Trainee Experience

Cited by: 9
Authors
Conti, Simon L. [1 ,2 ,3 ]
Brubaker, William [1 ]
Chung, Benjamin I. [1 ]
Sofer, Mario [4 ]
Hsi, Ryan S. [5 ]
Shinghal, Rajesh [6 ]
Elliott, Christopher S. [1 ,7 ]
Caruso, Thomas [3 ,8 ]
Leppert, John T. [1 ,2 ]
Affiliations
[1] Stanford Univ, Sch Med, Dept Urol, Stanford, CA 94305 USA
[2] Vet Affairs Palo Alto Hlth Care Syst, Palo Alto, CA USA
[3] Johns Hopkins Sch Educ, Baltimore, MD USA
[4] Tel Aviv Sourasky Med Ctr, Tel Aviv, Israel
[5] Vanderbilt Univ, Med Ctr, Dept Urol Surg, Nashville, TN USA
[6] Palo Alto Med Fdn, Palo Alto, CA USA
[7] Santa Clara Valley Med Ctr, Div Urol, Santa Clara, CA USA
[8] Stanford Univ, Sch Med, Dept Anesthesia, Stanford, CA 94305 USA
Keywords
assessment; kidney stones; ureteroscopy; laser lithotripsy; crowd sourcing; CROWD-SOURCED ASSESSMENT; TECHNICAL SKILL; ASSESSMENT-TOOL;
DOI
10.1089/end.2018.0534
Chinese Library Classification
R5 [Internal Medicine]; R69 [Urology (Urogenital Diseases)];
Discipline Codes
1002; 100201;
Abstract
Objectives: We sought to validate the use of crowdsourced surgical video assessment in evaluating urology residents performing flexible ureteroscopic laser lithotripsy.

Methods: We collected video feeds from 30 intrarenal ureteroscopic laser lithotripsy cases in which residents, postgraduate year (PGY) two through six, handled the ureteroscope. The video feeds were annotated to represent overall performance and to contain the parts of the procedure being scored. Videos were submitted to a commercially available surgical video evaluation platform (Crowd-Sourced Assessment of Technical Skills). We used a validated ureteroscopic laser lithotripsy global assessment tool, modified to include only those domains that could be evaluated from the captured video. Videos were evaluated by crowd workers recruited through Amazon's Mechanical Turk platform and by five endourology-trained experts. Mean scores were calculated, and intraclass correlation coefficients (ICCs) were computed for the expert domain and total scores. ICCs were estimated using a linear mixed-effects model. Spearman rank correlation coefficients were calculated to measure the strength of the relationship between mean crowd and mean expert scores.

Results: A total of 30 videos were reviewed 2488 times by 487 crowd workers and five expert endourologists. ICCs between expert raters were all below accepted levels of agreement (0.30), with the overall score having an ICC of <0.001. For individual domains, crowd scores did not correlate with expert scores, except for the stone retrieval domain (0.60, p=0.015). In addition, crowdsourced scores were negatively correlated with PGY level (0.44, p=0.019).

Conclusions: There is poor agreement between experts and poor correlation between expert and crowd scores when evaluating video feeds of ureteroscopic laser lithotripsy. The use of intraoperative video of ureteroscopy with laser lithotripsy to assess resident trainee skills does not appear reliable. This is further supported by the lack of correlation between crowd scores and advancing PGY level.
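The statistical approach described in the methods (an ICC derived from the variance components of a linear mixed-effects model, plus a Spearman rank correlation between mean crowd and mean expert scores) can be sketched as follows. This is an illustrative reconstruction on simulated ratings, not the study's actual analysis; the rater counts, variable names, and simulated score distributions are assumptions for the example only.

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr
import statsmodels.formula.api as smf

# Simulated data (assumption): 30 videos, each scored by 5 expert raters.
rng = np.random.default_rng(0)
n_videos, n_raters = 30, 5
video_effect = rng.normal(0.0, 1.0, n_videos)  # latent per-video quality
scores = video_effect[:, None] + rng.normal(0.0, 1.0, (n_videos, n_raters))

df = pd.DataFrame({
    "score": scores.ravel(),
    "video": np.repeat(np.arange(n_videos), n_raters),
})

# ICC from a random-intercept mixed model: between-video variance
# divided by total (between-video + residual) variance.
model = smf.mixedlm("score ~ 1", df, groups=df["video"]).fit()
var_between = float(model.cov_re.iloc[0, 0])  # random-intercept variance
var_within = float(model.scale)               # residual variance
icc = var_between / (var_between + var_within)

# Spearman rank correlation between hypothetical crowd mean scores
# and the expert mean score for each video.
crowd_mean = video_effect + rng.normal(0.0, 0.5, n_videos)
expert_mean = scores.mean(axis=1)
rho, p_value = spearmanr(crowd_mean, expert_mean)

print(f"ICC = {icc:.3f}, Spearman rho = {rho:.3f} (p = {p_value:.4f})")
```

With only form-level agreement between raters (as the study reports for experts), the between-video variance component shrinks toward zero and the ICC falls accordingly, which is why an ICC near 0 signals that raters are not distinguishing the same videos as good or bad.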
Pages: 42-49
Page count: 8