Efficient Grading of Prostate Cancer WSI with Deep Learning

Times Cited: 0
Authors
Bhattacharyya, Riddhasree [1 ]
Roy, Paromita [2 ]
Banerji, Sugata [3 ]
Mitra, Sushmita [1 ]
Affiliations
[1] Indian Stat Inst, Machine Intelligence Unit, Kolkata 700108, W Bengal, India
[2] Tata Med Ctr, Dept Pathol, Kolkata 700160, W Bengal, India
[3] Lake Forest Coll, Dept Math & Comp Sci, Lake Forest, IL 60045 USA
Source
DIGITAL AND COMPUTATIONAL PATHOLOGY, MEDICAL IMAGING 2024 | 2024, Vol. 12933
Keywords
histopathology; prostate cancer; ISUP grading; WSI; computer vision; CNN;
DOI
10.1117/12.3006835
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Histopathology, the microscopic examination of tissue samples, is central to cancer diagnosis. Although this task is currently performed by human experts, designing computer vision systems to assist them is an active research area, particularly given the success of convolutional neural networks (CNNs) in image segmentation and classification over the past decade. However, applying CNNs to this problem is challenging for several reasons, including extremely high image resolution (with the accompanying computational burden), variations in sample processing, and insufficient annotation. In this work, we propose a CNN-based approach to prostate cancer grading from Whole Slide Images (WSIs). We use a patch-based, multi-step training algorithm to address the challenges of large image size, tissue-sample variation, and partial annotation. We then propose two novel classification strategies that use an ensemble of CNN models to classify tissue slide images into ISUP grades (1-5). We demonstrate the efficacy of our method on the publicly available, large-scale Prostate cANcer graDe Assessment (PANDA) Challenge dataset, measuring effectiveness with the quadratically weighted Cohen's kappa score. The results are highly accurate (kappa score of 0.88) and better than other leading state-of-the-art methods.
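The slide-level metric named in the abstract, the quadratically weighted Cohen's kappa, can be computed directly with scikit-learn; a minimal sketch is given below. The patch-tiling helper, tile size, background threshold, and example grade labels are illustrative assumptions for a patch-based pipeline in general, not values or code taken from the paper.

```python
# Illustrative sketch only: non-overlapping patch tiling of a slide image and
# the quadratic weighted kappa used for ISUP-grade evaluation. Tile size,
# background threshold, and the example labels are assumptions.
import numpy as np
from sklearn.metrics import cohen_kappa_score


def tile_patches(wsi: np.ndarray, tile: int = 256):
    """Split an H x W x 3 image into non-overlapping tile x tile patches,
    discarding mostly-background (near-white) tiles."""
    h, w = wsi.shape[:2]
    patches = []
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            p = wsi[y:y + tile, x:x + tile]
            if p.mean() < 230:  # crude tissue-vs-background filter
                patches.append(p)
    return patches


# Synthetic stand-in for a (downsampled) slide image.
wsi = (np.random.rand(1024, 1024, 3) * 255).astype(np.uint8)
print(f"{len(tile_patches(wsi))} tissue patches kept")

# Hypothetical slide-level ISUP grades (ground truth vs. model predictions).
y_true = [1, 2, 3, 4, 5, 3, 2, 1]
y_pred = [1, 2, 4, 4, 5, 3, 1, 1]

# Quadratically weighted Cohen's kappa, the PANDA evaluation metric.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
print(f"quadratic weighted kappa = {qwk:.3f}")
```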
Pages: 7