Automatic detection of hand hygiene using computer vision technology

Times cited: 42
Authors
Singh, Amit [1]
Haque, Albert [2]
Alahi, Alexandre [3]
Yeung, Serena [4]
Guo, Michelle [2]
Glassman, Jill R. [5]
Beninati, William [6]
Platchek, Terry [1,5]
Li, Fei-Fei [2]
Milstein, Arnold [5]
Affiliations
[1] Stanford University, Department of Pediatrics, School of Medicine, Stanford, CA 94305, USA
[2] Stanford University, Department of Computer Science, Stanford, CA 94305, USA
[3] École Polytechnique Fédérale de Lausanne, Department of Civil Engineering, Lausanne, Switzerland
[4] Stanford University, Department of Biomedical Data Science, Stanford, CA 94305, USA
[5] Stanford University, School of Medicine, Clinical Excellence Research Center, Stanford, CA, USA
[6] Intermountain TeleHealth Services, Murray, UT, USA
Keywords
computer vision; hand hygiene; healthcare acquired infections; patient safety; machine learning; artificial intelligence; depth sensing
DOI
10.1093/jamia/ocaa115
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Objective: Hand hygiene is essential for preventing hospital-acquired infections but is difficult to track accurately. The current gold standard, in-person human auditing, is insufficient for assessing true overall compliance. Computer vision technology has the potential to provide more accurate appraisals. Our primary objective was to evaluate whether a computer vision algorithm could accurately detect hand hygiene dispenser use in images captured by depth sensors.
Materials and Methods: Sixteen depth sensors were installed on one hospital unit, and images were collected continuously from March to August 2017. A machine learning algorithm based on a convolutional neural network was trained to detect hand hygiene dispenser use in the images. The algorithm's output was then compared with simultaneous in-person observations of dispenser use, and the concordance rate between the human observations and the algorithm's assessments was calculated. Ground truth was established by blinded annotation of the entire image set, and sensitivity and specificity were calculated for both human and machine-level observation.
Results: The concordance rate between human and algorithm was 96.8% (kappa = 0.85). Concordance among the 3 independent auditors who established ground truth was 95.4% (Fleiss' kappa = 0.87). Sensitivity and specificity of the machine learning algorithm were 92.1% and 98.3%, respectively; human observation showed a sensitivity of 85.2% and a specificity of 99.4%.
Conclusions: The computer vision algorithm was equivalent to human observation in detecting hand hygiene dispenser use. Because it can cover a unit continuously in space and time, computer vision monitoring has the potential to provide a more complete appraisal of hand hygiene activity in hospitals than the current gold standard.
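As an illustration of the evaluation described in the abstract, the short Python sketch below computes the same kinds of agreement and accuracy statistics from binary labels: percent agreement (concordance) and Cohen's kappa between the algorithm and a human observer, plus sensitivity and specificity against blinded ground-truth annotations. This is not the authors' code; the function names and example labels are hypothetical, and a label of 1 simply stands for "dispenser use detected" in a given image.

# Minimal sketch (not the study's code) of the agreement and accuracy metrics
# reported above. Labels are binary: 1 = hand hygiene dispenser use detected
# in an image, 0 = no use. All data below are hypothetical.

from typing import Sequence, Tuple


def concordance_and_kappa(rater_a: Sequence[int], rater_b: Sequence[int]) -> Tuple[float, float]:
    """Percent agreement and Cohen's kappa between two binary raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over both classes of the product of marginal rates.
    p_a1, p_b1 = sum(rater_a) / n, sum(rater_b) / n
    chance = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return observed, (observed - chance) / (1 - chance)


def sensitivity_specificity(predicted: Sequence[int], truth: Sequence[int]) -> Tuple[float, float]:
    """True-positive rate and true-negative rate against ground-truth labels."""
    tp = sum(p == 1 and t == 1 for p, t in zip(predicted, truth))
    tn = sum(p == 0 and t == 0 for p, t in zip(predicted, truth))
    fp = sum(p == 1 and t == 0 for p, t in zip(predicted, truth))
    fn = sum(p == 0 and t == 1 for p, t in zip(predicted, truth))
    return tp / (tp + fn), tn / (tn + fp)


if __name__ == "__main__":
    algorithm = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]   # hypothetical algorithm output
    human     = [1, 0, 1, 0, 0, 0, 1, 0, 0, 1]   # hypothetical in-person observer
    ground    = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]   # hypothetical blinded annotation

    agreement, kappa = concordance_and_kappa(algorithm, human)
    sens, spec = sensitivity_specificity(algorithm, ground)
    print(f"human-vs-algorithm concordance {agreement:.1%}, kappa {kappa:.2f}")
    print(f"algorithm sensitivity {sens:.1%}, specificity {spec:.1%}")

Cohen's kappa is reported alongside raw concordance because it discounts the agreement expected by chance, which matters when most images contain no dispenser use; the Fleiss' kappa reported for the 3 annotators generalizes the same idea to more than two raters.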
Pages: 1316-1320
Number of pages: 5