GraphRegNet: Deep Graph Regularisation Networks on Sparse Keypoints for Dense Registration of 3D Lung CTs

Cited by: 49
Authors
Hansen, Lasse [1 ]
Heinrich, Mattias P. [1 ]
Affiliations
[1] Univ Lubeck, Inst Med Informat, D-23562 Lubeck, Germany
Keywords
Lung; Computed tomography; Three-dimensional displays; Biomedical imaging; Feature extraction; Strain; Estimation; Deformable registration; graph learning; thoracic CT; DEFORMABLE IMAGE REGISTRATION; LEARNING FRAMEWORK; MOTION;
DOI
10.1109/TMI.2021.3073986
CLC classification number
TP39 [Applications of computers];
Discipline classification code
081203; 0835;
Abstract
In the last two years, learning-based methods have started to show encouraging results in a range of supervised and unsupervised medical image registration tasks. Deep neural networks enable (near) real-time applications through fast inference times and have tremendous potential for increased registration accuracy through task-specific learning. However, the estimation of large 3D deformations, as present for example in inhale-to-exhale lung CT or inter-patient abdominal MRI registration, is still a major challenge for the widely adopted U-Net-like network architectures. Even when using multi-level strategies, current state-of-the-art deep learning registration results do not yet reach the high accuracy of conventional frameworks. To overcome the problem of large deformations for deep learning approaches, in this work we present GraphRegNet, a sparse keypoint-based geometric network for dense deformable medical image registration. Similar to the successful 2D optical flow estimation of FlowNet or PWC-Net, we leverage discrete dense displacement maps to facilitate the registration process. To cope with the enormously increasing memory requirements of displacement maps for 3D medical volumes, and to obtain a well-regularised and accurate deformation field, we 1) formulate the registration task as the prediction of displacement vectors on a sparse irregular grid of distinctive keypoints and 2) introduce our efficient GraphRegNet for displacement regularisation, a combination of convolutional and graph neural network layers in a unified architecture. In our experiments on exhale-to-inhale lung CT registration we demonstrate substantial improvements (TRE below 1.4 mm) over other deep learning methods. Our code is publicly available at https://github.com/multimodallearning/graphregnet.
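The two ideas named in the abstract (displacement prediction on a sparse keypoint grid, and regularisation over a graph of those keypoints) can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: GraphRegNet learns its regularisation with convolutional and graph neural network layers, whereas the stand-in below uses a fixed soft-argmin over candidate displacements followed by plain averaging over a k-nearest-neighbour graph; all function names and parameters are hypothetical.

```python
# Illustrative sketch of sparse keypoint displacement regularisation.
# Assumptions (not from the paper): soft-argmin candidate selection and
# fixed kNN-graph averaging replace GraphRegNet's learned layers.
import math
from typing import List, Tuple

Point = Tuple[float, float, float]

def knn_graph(points: List[Point], k: int) -> List[List[int]]:
    """For each keypoint, the indices of its k nearest neighbours (excluding itself)."""
    nbrs = []
    for i, p in enumerate(points):
        dists = sorted((math.dist(p, q), j) for j, q in enumerate(points) if j != i)
        nbrs.append([j for _, j in dists[:k]])
    return nbrs

def soft_argmin(cands: List[Point], costs: List[float], temp: float = 1.0) -> Point:
    """Expected displacement under a softmax over negative matching costs."""
    w = [math.exp(-c / temp) for c in costs]
    s = sum(w)
    return tuple(sum(wi * c[ax] for wi, c in zip(w, cands)) / s for ax in range(3))

def regularise(points: List[Point], cands: List[Point],
               costs: List[List[float]], k: int = 3, iters: int = 2) -> List[Point]:
    """Per-keypoint soft-argmin displacements, then iterative averaging over
    the kNN graph (a crude, fixed substitute for the learned GNN layers)."""
    nbrs = knn_graph(points, k)
    disp = [soft_argmin(cands, c) for c in costs]
    for _ in range(iters):
        disp = [
            tuple(0.5 * disp[i][ax]
                  + 0.5 * sum(disp[j][ax] for j in nbrs[i]) / len(nbrs[i])
                  for ax in range(3))
            for i in range(len(points))
        ]
    return disp
```

The graph averaging pulls an outlier keypoint (whose cheapest candidate disagrees with its neighbours) toward the locally consistent motion, which is the qualitative effect the learned regularisation is designed to achieve; a dense deformation field would then be interpolated from these sparse vectors.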
Pages: 2246-2257
Page count: 12