Deep-learning-based image registration and automatic segmentation of organs-at-risk in cone-beam CT scans from high-dose radiation treatment of pancreatic cancer

Cited by: 27
Authors
Han, Xu [1 ]
Hong, Jun [2 ]
Reyngold, Marsha [3 ]
Crane, Christopher [3 ]
Cuaron, John [3 ]
Hajj, Carla [3 ]
Mann, Justin [3 ]
Zinovoy, Melissa [3 ]
Greer, Hastings [1 ]
Yorke, Ellen [2 ]
Mageras, Gig [2 ]
Niethammer, Marc [1 ]
Affiliations
[1] Univ N Carolina, Dept Comp Sci, Chapel Hill, NC 27599 USA
[2] Mem Sloan Kettering Canc Ctr, Dept Med Phys, New York, NY 10065 USA
[3] Mem Sloan Kettering Canc Ctr, Dept Radiat Oncol, New York, NY 10065 USA
Funding
US National Institutes of Health (NIH)
Keywords
cone-beam CT; deformable image registration; machine learning; pancreatic cancer; THERAPY; FLOWS;
DOI
10.1002/mp.14906
Chinese Library Classification
R8 [Special Medicine]; R445 [Diagnostic Imaging]
Subject Classification Codes
1002; 100207; 1009
Abstract
Purpose: Accurate deformable registration between computed tomography (CT) and cone-beam CT (CBCT) images of pancreatic cancer patients treated with high biologically effective radiation doses is essential to assess changes in organ-at-risk (OAR) locations and shapes and to compute delivered dose. This study describes the development and evaluation of a deep-learning (DL) registration model to predict OAR segmentations on the CBCT derived from segmentations on the planning CT.

Methods: The DL model is trained with CT-CBCT image pairs of the same patient, on which OAR segmentations of the small bowel, stomach, and duodenum have been manually drawn. A transformation map is obtained, which serves to warp the CT image and segmentations. In addition to a regularity loss and an image similarity loss, an OAR segmentation similarity loss is also used during training, which penalizes the mismatch between warped CT segmentations and manually drawn CBCT segmentations. At test time, CBCT segmentations are not required, as they are instead obtained from the warped CT segmentations. In an IRB-approved retrospective study, a dataset of 40 patients, each with one planning CT and two CBCT scans, was used in a fivefold cross-validation to train and evaluate the model, using physician-drawn segmentations as reference. Images were preprocessed to remove gas pockets. Network performance was compared to two intensity-based deformable registration algorithms (large deformation diffeomorphic metric mapping [LDDMM] and multimodality free-form [MMFF]) as baselines. Evaluated metrics were the Dice similarity coefficient (DSC), change in OAR volume within a volume of interest (enclosing the low-dose PTV plus a 1 cm margin) from planning CT to CBCT, and maximum dose to 5 cm³ of the OAR (D5cc).

Results: Processing time for one CT-CBCT registration with the DL model at test time was less than 5 s on a GPU-based system, compared to an average of 30 min for LDDMM optimization. For both small bowel and stomach/duodenum, the DL model yielded a larger median DSC and smaller interquartile variation than either MMFF (paired t-test P < 10⁻⁴ for both types of OARs) or LDDMM (P < 10⁻³ and P = 0.03, respectively). The root-mean-square deviation (RMSD) of the DL-predicted change in small bowel volume relative to reference was 22% less than for MMFF (P = 0.007). The RMSD of the DL-predicted stomach/duodenum volume change was 28% less than for LDDMM (P = 0.0001). The RMSD of the DL-predicted D5cc in the small bowel was 39% less than for MMFF (P = 0.001); in the stomach/duodenum, the RMSD of the DL-predicted D5cc was 18% less than for LDDMM (P < 10⁻³).

Conclusions: The proposed deep-network CT-to-CBCT deformable registration model shows improved segmentation accuracy compared to intensity-based algorithms and achieves an order-of-magnitude reduction in processing time. © 2021 American Association of Physicists in Medicine
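The training objective described in the abstract combines three terms: an image similarity loss, a deformation regularity loss, and an OAR segmentation similarity loss. The sketch below is a minimal, hypothetical PyTorch illustration of such a composite loss, not the authors' implementation: the tensor layout (B × C × D × H × W), the displacement-field convention passed to grid_sample, the MSE image similarity term, the finite-difference smoothness regularizer, and the weights w_sim, w_reg, and w_seg are all assumptions made for illustration.

```python
# Minimal sketch (not the authors' code) of a composite registration loss of the
# kind described in the abstract: image similarity + deformation regularity +
# OAR segmentation overlap. Shapes, conventions, and weights are illustrative.
import torch
import torch.nn.functional as F

def soft_dice_loss(warped_seg, target_seg, eps=1e-6):
    """1 - soft Dice overlap between warped CT OAR masks and CBCT OAR masks."""
    inter = (warped_seg * target_seg).sum(dim=(2, 3, 4))
    denom = warped_seg.sum(dim=(2, 3, 4)) + target_seg.sum(dim=(2, 3, 4))
    return 1.0 - ((2.0 * inter + eps) / (denom + eps)).mean()

def smoothness_loss(disp):
    """Penalize spatial gradients of the displacement field (regularity term)."""
    dz = (disp[:, :, 1:, :, :] - disp[:, :, :-1, :, :]).pow(2).mean()
    dy = (disp[:, :, :, 1:, :] - disp[:, :, :, :-1, :]).pow(2).mean()
    dx = (disp[:, :, :, :, 1:] - disp[:, :, :, :, :-1]).pow(2).mean()
    return dz + dy + dx

def warp(volume, disp):
    """Warp a B x C x D x H x W volume with a dense displacement field (assumed
    to be B x 3 x D x H x W, in normalized [-1, 1] grid_sample coordinates)."""
    b, _, D, H, W = volume.shape
    zz, yy, xx = torch.meshgrid(
        torch.linspace(-1, 1, D), torch.linspace(-1, 1, H), torch.linspace(-1, 1, W),
        indexing="ij")
    grid = torch.stack((xx, yy, zz), dim=-1).unsqueeze(0).to(volume)  # identity grid
    flow = disp.permute(0, 2, 3, 4, 1)                                # B x D x H x W x 3
    return F.grid_sample(volume, grid + flow, mode="bilinear", align_corners=True)

def registration_loss(ct, cbct, ct_seg, cbct_seg, disp,
                      w_sim=1.0, w_reg=1.0, w_seg=1.0):
    """Composite training loss: intensity similarity + regularity + OAR overlap."""
    warped_ct = warp(ct, disp)
    warped_seg = warp(ct_seg, disp)
    sim = F.mse_loss(warped_ct, cbct)  # stand-in for the paper's similarity metric
    return (w_sim * sim
            + w_reg * smoothness_loss(disp)
            + w_seg * soft_dice_loss(warped_seg, cbct_seg))
```

For example, with ct and cbct of shape (1, 1, 64, 64, 64), ct_seg and cbct_seg of shape (1, 3, 64, 64, 64) (one channel per OAR), and a network-predicted disp of shape (1, 3, 64, 64, 64), registration_loss(...) returns a scalar that can be backpropagated through the registration network; at test time only the warp of the CT segmentations would be needed.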
Pages: 3084-3095
Page count: 12