A Transformer-Based Iterative Reconstruction Model for Sparse-View CT Reconstruction

Cited by: 12
Authors
Xia, Wenjun [1 ]
Yang, Ziyuan [1 ]
Zhou, Qizheng [2 ]
Lu, Zexin [1 ]
Wang, Zhongxian [1 ]
Zhang, Yi [1 ]
Affiliations
[1] Sichuan Univ, Chengdu 610065, Peoples R China
[2] SUNY Stony Brook, Stony Brook, NY 11794 USA
Source
MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION, MICCAI 2022, PT VI | 2022 / Vol. 13436
Keywords
Computed tomography; Image reconstruction; Deep learning; Nonlocal regularization; Transformer; LOW-DOSE CT; CONVOLUTIONAL NEURAL-NETWORK; DISTANCE-DRIVEN PROJECTION; IMAGE-RECONSTRUCTION; INVERSE PROBLEMS; NET;
DOI
10.1007/978-3-031-16446-0_75
Chinese Library Classification: TB8 [Photographic technology]
Discipline code: 0804
Abstract
Sparse-view computed tomography (CT) is one of the primary means of reducing radiation risk, but sparse-view reconstructions are contaminated by severe artifacts. With carefully designed regularization terms, iterative reconstruction (IR) algorithms can achieve promising results, and with the introduction of deep learning techniques, regularization terms learned by convolutional neural networks (CNNs) have attracted much attention and can further improve performance. In this paper, we propose a learned local-nonlocal regularization-based model, called RegFormer, to reconstruct CT images. Specifically, we unroll the iterative scheme into a neural network and replace handcrafted regularization terms with learnable kernels. Convolution layers learn a local regularization with excellent denoising performance, while transformer encoders and decoders incorporate a learned nonlocal prior into the model, preserving structures and details. To improve the ability to extract deep features across iterations, we introduce an iteration transmission (IT) module, which further promotes the efficiency of each iteration. Experimental results show that the proposed RegFormer achieves competitive performance in artifact reduction and detail preservation compared to several state-of-the-art sparse-view CT reconstruction methods.
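The abstract describes unrolling an iterative scheme in which each step combines a data-fidelity gradient with a learned local (convolutional) prior and a learned nonlocal (self-attention) prior. The toy NumPy sketch below illustrates only that general pattern of one unrolled step; the update rule, step sizes, module structure, and names (`unrolled_step`, `local_reg`, `nonlocal_reg`) are illustrative assumptions, not the paper's actual RegFormer architecture.

```python
import numpy as np

def local_reg(x, kernel):
    """Stand-in for a learned local prior: one 3x3 convolution (zero-padded)."""
    h, w = x.shape
    pad = np.pad(x, 1)
    out = np.zeros_like(x)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(pad[i:i + 3, j:j + 3] * kernel)
    return out

def nonlocal_reg(x, Wq, Wk, Wv):
    """Stand-in for a learned nonlocal prior: self-attention over pixels as tokens."""
    f = x.reshape(-1, 1)                  # one scalar feature per pixel
    q, k, v = f @ Wq, f @ Wk, f @ Wv      # linear projections
    scores = q @ k.T / np.sqrt(k.shape[1])
    scores = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn = scores / scores.sum(axis=1, keepdims=True)   # row-wise softmax
    return (attn @ v).reshape(x.shape)

def unrolled_step(x, y, A, step_size, kernel, Wq, Wk, Wv):
    """One unrolled iteration: gradient step on ||Ax - y||^2 plus learned corrections."""
    grad_data = A.T @ (A @ x.ravel() - y)             # data-fidelity gradient
    x = x - step_size * grad_data.reshape(x.shape)
    x = x - step_size * local_reg(x, kernel)          # CNN-like local prior
    x = x - step_size * nonlocal_reg(x, Wq, Wk, Wv)   # transformer-like nonlocal prior
    return x
```

In a trained unrolled network, `kernel`, `Wq`, `Wk`, `Wv`, and `step_size` would be learnable parameters, typically distinct per iteration; the paper's IT module additionally passes deep features between iterations, which this sketch omits.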
Pages: 790-800
Number of pages: 11