Multiple supervised residual network for osteosarcoma segmentation in CT images

Cited by: 73
Authors
Zhang, Rui [1 ]
Huang, Lin [1 ,2 ]
Xia, Wei [1 ]
Zhang, Bo [3 ]
Qiu, Bensheng [2 ]
Gao, Xin [1 ]
Affiliations
[1] Chinese Acad Sci, Suzhou Inst Biomed Engn & Technol, Suzhou 215163, Jiangsu, Peoples R China
[2] Univ Sci & Technol China, Hefei, Anhui, Peoples R China
[3] Soochow Univ, Affiliated Hosp 2, Suzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Osteosarcoma segmentation; Deep residual network; Multiple supervised networks; CONVOLUTIONAL NEURAL-NETWORKS; BRAIN-TUMOR SEGMENTATION; CHEMOTHERAPY; SURVIVAL; MRI;
DOI
10.1016/j.compmedimag.2018.01.006
Chinese Library Classification
R318 [Biomedical Engineering];
Discipline code
0831;
Abstract
Automatic and accurate segmentation of the osteosarcoma region in CT images can help doctors devise a reasonable treatment plan, thereby improving the cure rate. In this paper, a multiple supervised residual network (MSRN) was proposed for osteosarcoma image segmentation. Three supervised side output modules were added to a residual network. The shallow side output module extracted image shape features, such as edge and texture features, while the deep side output module extracted semantic features. Each side output module computed the loss between its output probability map and the ground truth and back-propagated this loss, so that the parameters of the residual network could be updated by gradient descent; this guided the multi-scale feature learning of the network. The final segmentation result was obtained by fusing the outputs of the three side output modules. A total of 1900 CT images from 15 osteosarcoma patients were used to train the network, and 405 CT images from another 8 osteosarcoma patients were used to test it. Results indicated that MSRN achieved a Dice similarity coefficient (DSC) of 89.22%, a sensitivity of 88.74%, and an F1-measure of 0.9305, all higher than those obtained by the fully convolutional network (FCN) and U-Net. Thus, MSRN gives more accurate osteosarcoma segmentation results than FCN and U-Net.
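The deep-supervision scheme described above (side output heads at several depths of a residual backbone, each supervised against the same ground truth, with the final mask obtained by fusing the side outputs) can be sketched as follows. This is a minimal illustrative sketch in PyTorch under assumed layer sizes and a simple learned-fusion scheme; it is not the authors' exact MSRN architecture.

```python
# Sketch of a multiple-supervised residual segmentation network:
# three residual stages, one supervised side-output head per stage,
# and a 1x1-conv fusion of the upsampled side outputs.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResBlock(nn.Module):
    """Basic residual block: x + F(x)."""
    def __init__(self, ch):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, 3, padding=1)
        self.conv2 = nn.Conv2d(ch, ch, 3, padding=1)

    def forward(self, x):
        return F.relu(x + self.conv2(F.relu(self.conv1(x))))


class MSRNSketch(nn.Module):
    def __init__(self, in_ch=1, ch=16):
        super().__init__()
        self.stem = nn.Conv2d(in_ch, ch, 3, padding=1)
        self.stage1 = ResBlock(ch)   # shallow: edge/texture-scale features
        self.stage2 = ResBlock(ch)
        self.stage3 = ResBlock(ch)   # deep: semantic-scale features
        # One supervised side-output head (1x1 conv -> 1-channel map) per stage.
        self.side1 = nn.Conv2d(ch, 1, 1)
        self.side2 = nn.Conv2d(ch, 1, 1)
        self.side3 = nn.Conv2d(ch, 1, 1)
        # Fuse the three upsampled side outputs with a learned 1x1 conv.
        self.fuse = nn.Conv2d(3, 1, 1)

    def forward(self, x):
        h, w = x.shape[-2:]
        f1 = self.stage1(self.stem(x))
        f2 = self.stage2(F.max_pool2d(f1, 2))
        f3 = self.stage3(F.max_pool2d(f2, 2))
        up = lambda t: F.interpolate(t, size=(h, w), mode="bilinear",
                                     align_corners=False)
        s1 = self.side1(f1)
        s2 = up(self.side2(f2))
        s3 = up(self.side3(f3))
        fused = self.fuse(torch.cat([s1, s2, s3], dim=1))
        return s1, s2, s3, fused


def msrn_loss(outputs, target):
    # Each side output (and the fused map) is supervised against the same
    # ground truth; summing the losses back-propagates gradient to every scale.
    return sum(F.binary_cross_entropy_with_logits(o, target) for o in outputs)
```

Because every side output receives its own loss term, shallow layers get a direct gradient signal instead of relying only on gradient flowing back from the deepest layer, which is the stated motivation for the multiple-supervision design.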
Pages: 1-8
Page count: 8
Related papers
35 records in total
[11]   Bone tumor segmentation from MR perfusion images with neural networks using multi-scale pharmacokinetic features [J].
Frangi, AF ;
Egmont-Petersen, M ;
Niessen, WJ ;
Reiber, JHC ;
Viergever, MA .
IMAGE AND VISION COMPUTING, 2001, 19 (9-10) :679-690
[12]   Hybrid artificial neural network segmentation and classification of dynamic contrast-enhanced MR imaging (DEMRI) of osteosarcoma [J].
Glass, JO ;
Reddick, WE .
MAGNETIC RESONANCE IMAGING, 1998, 16 (09) :1075-1083
[13]   Brain tumor segmentation with Deep Neural Networks [J].
Havaei, Mohammad ;
Davy, Axel ;
Warde-Farley, David ;
Biard, Antoine ;
Courville, Aaron ;
Bengio, Yoshua ;
Pal, Chris ;
Jodoin, Pierre-Marc ;
Larochelle, Hugo .
MEDICAL IMAGE ANALYSIS, 2017, 35 :18-31
[14]   Deep Residual Learning for Image Recognition [J].
He, Kaiming ;
Zhang, Xiangyu ;
Ren, Shaoqing ;
Sun, Jian .
2016 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2016, :770-778
[15]  
Jain YK., 2011, International Journal of Computer Communication Technology, V2, P45, DOI 10.47893/IJCCT.2013.1201
[16]   Tumor necrosis rate adjusted by tumor volume change is a better predictor of survival of localized osteosarcoma patients [J].
Kim, Min Suk ;
Lee, Soo-Yong ;
Cho, Wan Hyeong ;
Song, Won Seok ;
Koh, Jae-Soo ;
Lee, Jun Ah ;
Yoo, Ji Young ;
Jeon, Dae-Geun .
ANNALS OF SURGICAL ONCOLOGY, 2008, 15 (03) :906-914
[17]   Semantic Image Segmentation via Deep Parsing Network [J].
Liu, Ziwei ;
Li, Xiaoxiao ;
Luo, Ping ;
Loy, Chen Change ;
Tang, Xiaoou .
2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2015, :1377-1385
[18]  
Long J, 2015, PROC CVPR IEEE, P3431, DOI 10.1109/CVPR.2015.7298965
[19]   Osteosarcoma treatment - Where do we stand? A state of the art review [J].
Luetke, Anja ;
Meyers, Paul A. ;
Lewis, Ian ;
Juergens, Heribert .
CANCER TREATMENT REVIEWS, 2014, 40 (04) :523-532
[20]  
Lyksborg Mark, 2015, Image Analysis. 19th Scandinavian Conference, SCIA 2015. Proceedings: LNCS 9127, P201, DOI 10.1007/978-3-319-19665-7_17