Automatic grading of human blastocysts from time-lapse imaging

Cited by: 86
Authors
Kragh, Mikkel F. [1,2]
Rimestad, Jens [2]
Berntsen, Jorgen [2]
Karstoft, Henrik [1]
Affiliations
[1] Aarhus University, Department of Engineering, Aarhus, Denmark
[2] Vitrolife A/S, Viby, Denmark
Keywords
Time-lapse imaging; Automated blastocyst grading; Inner cell mass; Trophectoderm; Ordinal regression
KeyWords Plus: EMBRYO; IDENTIFICATION; CLASSIFICATION; STAGE
DOI
10.1016/j.compbiomed.2019.103494
Chinese Library Classification
Q [Biological Sciences]
Subject classification codes
07; 0710; 09
Abstract
Background: Blastocyst morphology is a predictive marker for the implantation success of in vitro fertilized human embryos. Morphology grading is therefore commonly used to select the embryo with the highest implantation potential. One challenge, however, is that morphology grading can be highly subjective when performed manually by embryologists. Grading systems generally discretize a continuous scale from low to high score, resulting in fuzzy, unclear boundaries between grading categories. Manual annotations therefore suffer from large inter- and intra-observer variance.

Method: In this paper, we propose a deep-learning method to automatically grade the morphological appearance of human blastocysts from time-lapse imaging. A convolutional neural network is trained to jointly predict inner cell mass (ICM) and trophectoderm (TE) grades from a single image frame, and a recurrent neural network is applied on top to incorporate temporal information from multiple frames of the expanding blastocysts.

Results: The method achieved above-human-level accuracy when evaluated against majority votes on an independent test set labeled by multiple embryologists. Furthermore, when implantation rates were evaluated for embryos grouped by morphology grade, human embryologists and our method showed similar correlations between predicted embryo quality and pregnancy outcome.

Conclusions: The proposed method shows improved prediction of ICM and TE grades for human blastocysts when the temporal information available with time-lapse imaging is utilized. The algorithm is considered at least on par with human embryologists at quality estimation: it performed better than the average human embryologist at ICM and TE prediction and provided a slightly better correlation between predicted embryo quality and implantability.
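The grading task described above is naturally an ordinal-regression problem, since ICM and TE grades (e.g. A/B/C) are ordered categories with unclear boundaries. The sketch below is not the authors' implementation; it merely illustrates, under the common extended-binary-encoding scheme for ordinal regression, how such grades could be encoded as cumulative binary targets for a network head, and uses simple averaging over frames as an illustrative stand-in for the paper's recurrent network. All function names and the A/B/C scale are assumptions for illustration.

```python
import numpy as np

GRADES = ["A", "B", "C"]  # ordinal ICM/TE grades, best to worst (illustrative)

def ordinal_encode(grade):
    """Encode an ordinal grade as K-1 cumulative binary targets.

    Target i answers "is the grade worse than GRADES[i]?", so
    A -> [0, 0], B -> [1, 0], C -> [1, 1].
    """
    rank = GRADES.index(grade)
    return [1 if rank > i else 0 for i in range(len(GRADES) - 1)]

def ordinal_decode(probs, threshold=0.5):
    """Map sigmoid outputs of the K-1 threshold units back to a grade
    by counting how many thresholds are exceeded."""
    rank = int(np.sum(np.asarray(probs) > threshold))
    return GRADES[rank]

def pool_over_time(frame_probs):
    """Average per-frame threshold probabilities across a time-lapse
    sequence -- a crude stand-in for temporal modeling with an RNN."""
    return np.mean(np.asarray(frame_probs), axis=0)
```

For example, three frames with threshold probabilities `[[0.9, 0.1], [0.7, 0.3], [0.8, 0.2]]` pool to `[0.8, 0.2]`, which decodes to grade "B". The cumulative encoding penalizes predictions more the further they fall from the true grade, which is why it is a common fit for graded categories like these.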
Pages: 10