Linear fine-tuning: a linear transformation based transfer strategy for deep MRI reconstruction

Times Cited: 0
Authors
Bi, Wanqing [1 ]
Xv, Jianan [1 ]
Song, Mengdie [1 ]
Hao, Xiaohan [1 ,2 ]
Gao, Dayong [3 ]
Qi, Fulang [1 ]
Affiliations
[1] Univ Sci & Technol China, Ctr Biomed Engn, Hefei, Anhui, Peoples R China
[2] Fuqing Med Co Ltd, Hefei, Anhui, Peoples R China
[3] Univ Washington, Dept Mech Engn, Seattle, WA USA
Keywords
magnetic resonance imaging reconstruction; deep learning; transfer learning; fine-tuning; transfer strategy; IMAGE; NETWORKS
DOI
10.3389/fnins.2023.1202143
CLC number
Q189 [Neuroscience]
Subject classification code
071006
Abstract
Introduction: Fine-tuning (FT) is a generally adopted transfer learning method for deep learning-based magnetic resonance imaging (MRI) reconstruction. In this approach, the reconstruction model is initialized with pre-trained weights derived from a source domain with ample data and subsequently updated with limited data from the target domain. However, the direct full-weight update strategy poses the risk of "catastrophic forgetting" and overfitting, hindering its effectiveness. The goal of this study is to develop a zero-weight-update transfer strategy that preserves pre-trained generic knowledge and reduces overfitting.
Methods: Based on the commonality between the source and target domains, we assume a linear transformation relationship between the optimal model weights of the source domain and those of the target domain. Accordingly, we propose a novel transfer strategy, linear fine-tuning (LFT), which introduces scaling and shifting (SS) factors into the pre-trained model. In contrast to FT, LFT updates only the SS factors during the transfer phase, while the pre-trained weights remain fixed.
Results: To evaluate the proposed LFT, we designed three transfer scenarios and conducted a comparative analysis of FT, LFT, and other methods at various sampling rates and data volumes. In the transfer scenario between different contrasts, LFT outperforms typical transfer strategies at various sampling rates and considerably reduces artifacts in the reconstructed images. In transfer scenarios between different slice directions or anatomical structures, LFT surpasses FT, particularly when the target domain contains a decreasing number of training images, with a maximum improvement of 2.06 dB (5.89%) in peak signal-to-noise ratio.
Discussion: The LFT strategy shows great potential to address the issues of "catastrophic forgetting" and overfitting in transfer scenarios for MRI reconstruction, while reducing reliance on the amount of data in the target domain. Linear fine-tuning is expected to shorten the development cycle of reconstruction models for adapting to complicated clinical scenarios, thereby enhancing the clinical applicability of deep MRI reconstruction.
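The zero-weight-update idea described in the abstract can be illustrated with a minimal numpy sketch. This is an assumption-laden toy, not the authors' implementation: the exact placement of the scaling and shifting (SS) factors in their reconstruction network is not specified here, so the sketch applies a per-output-channel scale (gamma) and shift (beta) to a single frozen linear layer.

```python
import numpy as np

rng = np.random.default_rng(0)

class LinearFTLayer:
    """Toy linearly fine-tuned layer (hypothetical, for illustration).

    The pre-trained weight W and bias b stay frozen; only the SS
    factors gamma (scale) and beta (shift) would be trained in the
    transfer phase, realizing W' = diag(gamma) @ W, b' = gamma*b + beta.
    """

    def __init__(self, w_pretrained, b_pretrained):
        self.W = w_pretrained           # frozen source-domain weight, shape (out, in)
        self.b = b_pretrained           # frozen source-domain bias, shape (out,)
        out_dim = w_pretrained.shape[0]
        self.gamma = np.ones(out_dim)   # trainable scale, identity at init
        self.beta = np.zeros(out_dim)   # trainable shift, zero at init

    def effective_weight(self):
        # Linear transformation of the frozen pre-trained parameters.
        return self.gamma[:, None] * self.W, self.gamma * self.b + self.beta

    def forward(self, x):
        W_eff, b_eff = self.effective_weight()
        return x @ W_eff.T + b_eff

    def trainable_parameters(self):
        # Only the SS factors are updated; W and b are never touched.
        return [self.gamma, self.beta]


layer = LinearFTLayer(rng.standard_normal((4, 3)), rng.standard_normal(4))
x = rng.standard_normal((2, 3))

# At initialization (gamma = 1, beta = 0) the layer reproduces the
# pre-trained model exactly, so no source-domain knowledge is lost.
assert np.allclose(layer.forward(x), x @ layer.W.T + layer.b)

# Far fewer trainable parameters than full fine-tuning of W and b.
n_ss = sum(p.size for p in layer.trainable_parameters())   # 8
n_full = layer.W.size + layer.b.size                       # 16
```

The identity initialization is the key design point: the transferred model starts exactly at the source-domain optimum, which is what lets this strategy sidestep catastrophic forgetting while the small SS parameter count limits overfitting on scarce target-domain data.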
Pages: 12