Automated segmentation of epithelial tissue in prostatectomy slides using deep learning

Cited by: 8
Authors
Bulten, Wouter [1 ,2 ]
Hulsbergen-van de Kaa, Christina A. [2 ]
van der Laak, Jeroen [1 ,2 ]
Litjens, Geert J. S. [1 ,2 ]
Affiliations
[1] Radboud Univ Nijmegen, Med Ctr, Diagnost Image Anal Grp, Nijmegen, Netherlands
[2] Radboud Univ Nijmegen, Med Ctr, Dept Pathol, Nijmegen, Netherlands
Source
MEDICAL IMAGING 2018: DIGITAL PATHOLOGY | 2018, Vol. 10581
Keywords
histopathology; whole-slide imaging; deep learning; segmentation; prostate cancer;
DOI
10.1117/12.2292872
Chinese Library Classification (CLC)
O43 [Optics];
Discipline classification codes
070207 ; 0803 ;
Abstract
Prostate cancer is generally graded by pathologists based on hematoxylin and eosin (H&E) stained slides. Because of the large size of the tumor areas in radical prostatectomies (RP), this task can be tedious and error-prone, with known high interobserver variability. Recent advances in deep learning have enabled the development of automated systems that may assist pathologists in prostate diagnostics. As prostate cancer originates from glandular tissue, an important prerequisite for the development of such algorithms is the ability to automatically differentiate glandular tissue from other tissues. In this paper, we propose a deep-learning-based method for automatically segmenting epithelial tissue in digitally scanned prostatectomy slides. We collected 30 single-center whole-mount tissue sections, with reported Gleason growth patterns ranging from 3 to 5, from 27 patients who underwent RP. Two different network architectures, U-Net and regular fully convolutional networks of varying depth, were trained using a set of sparsely annotated slides. We evaluated the trained networks on exhaustively annotated regions from a separate test set, which contained both healthy and cancerous epithelium with different Gleason growth patterns. The results show the effectiveness of our approach, with a pixel-based AUC score of 0.97. Our method contains no prior assumptions on glandular morphology, does not directly rely on the presence of lumina, and all features are learned by the network itself. The generated segmentation can be used to highlight regions of interest for pathologists and to improve cancer annotations, further enhancing an automatic cancer grading system.
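The abstract reports a pixel-based AUC of 0.97 as its evaluation metric. As an illustration of how that metric is computed (not the authors' code), the ROC AUC over flattened ground-truth masks and predicted probability maps can be obtained from the Mann-Whitney U statistic, with average ranks assigned to tied scores:

```python
def pixel_auc(labels, scores):
    """Pixel-based ROC AUC via the Mann-Whitney U statistic.

    labels: iterable of 0/1 ground-truth pixel labels (1 = epithelium),
            e.g. a flattened annotation mask
    scores: iterable of predicted probabilities, same length
    """
    pairs = sorted(zip(scores, labels))
    n = len(pairs)
    # Assign average ranks (1-based) to runs of tied scores.
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and pairs[j + 1][0] == pairs[i][0]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[k] = avg_rank
        i = j + 1
    n_pos = sum(lab for _, lab in pairs)
    n_neg = n - n_pos
    # AUC = (rank sum of positives - minimum possible rank sum) / (n_pos * n_neg)
    rank_sum_pos = sum(r for r, (_, lab) in zip(ranks, pairs) if lab == 1)
    return (rank_sum_pos - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

In practice one would flatten the whole-slide probability map and annotation mask and pass them to a library routine such as scikit-learn's `roc_auc_score`, which computes the same quantity.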
Pages: 7