Bidirectional brain image translation using transfer learning from generic pre-trained models

Cited by: 0
Authors
Haimour, Fatima [1 ]
Al-Sayyed, Rizik [2 ]
Mahafza, Waleed [3 ]
Al-Kadi, Omar S. [2 ]
Affiliations
[1] Zarqa Univ, Fac Informat Technol, Zarqa 13110, Jordan
[2] Univ Jordan, King Abdullah 2 Sch Informat Technol, Amman 11942, Jordan
[3] Jordan Univ Hosp, Dept Diagnost Radiol, Amman 11942, Jordan
Keywords
Image translation; Transfer learning; Pre-trained model; Brain tumor; Magnetic resonance imaging; Computed tomography; CycleGAN
DOI
10.1016/j.cviu.2024.104100
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Brain imaging plays a crucial role in the diagnosis and treatment of various neurological disorders, providing valuable insights into the structure and function of the brain. Techniques such as magnetic resonance imaging (MRI) and computed tomography (CT) enable non-invasive visualization of the brain, aiding in the understanding of brain anatomy, abnormalities, and functional connectivity. However, cost and radiation dose may limit the acquisition of specific image modalities, so medical image synthesis can be used to generate the required images without an additional acquisition. CycleGAN and other generative adversarial networks (GANs) are valuable tools for synthesizing images across various fields. In the medical domain, where obtaining labeled images is labor-intensive and expensive, data scarcity is a major challenge; recent studies propose transfer learning to overcome it, adapting CycleGAN models initially trained on non-medical data to generate realistic medical images. In this work, transfer learning was applied to MR-to-CT image translation and vice versa using 18 pre-trained non-medical models, each fine-tuned to achieve the best results. Model performance was evaluated using four widely used image quality metrics: peak signal-to-noise ratio (PSNR), structural similarity index (SSIM), universal quality index (UQI), and visual information fidelity (VIF). Quantitative evaluation and qualitative perceptual analysis by radiologists demonstrate the potential of transfer learning in medical imaging and the effectiveness of generic pre-trained models. The results provide compelling evidence of the model's exceptional performance, which can be attributed to the high quality of its training images and their similarity to actual human brain images. These results underscore the importance of carefully selecting appropriate and representative training images to optimize performance in brain image analysis tasks.
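The abstract names two concrete technical steps: fine-tuning a generator pre-trained on non-medical images, and scoring the synthesized images with four full-reference quality metrics. The following is a minimal PyTorch sketch of the fine-tuning step only; the tiny generator, the checkpoint path, the frozen-encoder recipe, and the plain L1 loss are illustrative assumptions, not the paper's actual 18 pre-trained models or its full CycleGAN objective.

```python
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Stand-in encoder-decoder generator (hypothetical; a real
    CycleGAN generator is a much deeper ResNet-style network)."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 7, padding=3), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 3, stride=2,
                               padding=1, output_padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 7, padding=3), nn.Tanh(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

g_mr2ct = TinyGenerator()
# Transfer-learning step: reload generic (non-medical) weights.
# The checkpoint path is hypothetical; strict=False tolerates mismatches.
# g_mr2ct.load_state_dict(torch.load("generic_generator.pth"), strict=False)

# One common recipe: freeze the encoder and fine-tune only the decoder.
# (The paper may instead fine-tune all layers.)
for p in g_mr2ct.encoder.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in g_mr2ct.parameters() if p.requires_grad), lr=2e-4)
l1 = nn.L1Loss()

mr = torch.randn(4, 1, 256, 256)  # stand-in MR slices
ct = torch.randn(4, 1, 256, 256)  # stand-in CT slices

optimizer.zero_grad()
fake_ct = g_mr2ct(mr)
loss = l1(fake_ct, ct)  # placeholder; CycleGAN uses adversarial + cycle losses
loss.backward()
optimizer.step()
print(f"fine-tuning step loss: {loss.item():.4f}")
```

The four reported metrics (PSNR, SSIM, UQI, VIF) are standard full-reference measures. One way to compute them, assuming a real and a synthesized slice are available as same-sized NumPy arrays on a common intensity scale, is the sewar package; the paper's own evaluation code may differ:

```python
import numpy as np
from sewar.full_ref import psnr, ssim, uqi, vifp

# Stand-in images; in practice these would be a real CT slice and
# the CT slice synthesized from the corresponding MR input.
real_ct = (np.random.rand(256, 256) * 255).astype(np.uint8)
synth_ct = (np.random.rand(256, 256) * 255).astype(np.uint8)

print("PSNR:", psnr(real_ct, synth_ct, MAX=255))
print("SSIM:", ssim(real_ct, synth_ct, MAX=255)[0])  # sewar returns (ssim, cs)
print("UQI :", uqi(real_ct, synth_ct))
print("VIFp:", vifp(real_ct, synth_ct))
```

Roughly, PSNR rewards pixel-level fidelity, SSIM and UQI reward structural fidelity, and VIF estimates the visual information preserved, which is why the paper pairs these scores with a radiologist's perceptual assessment.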
Pages: 15
Related papers
50 items in total
  • [31] Commonsense Knowledge Transfer for Pre-trained Language Models
    Zhou, Wangchunshu
    Le Bras, Ronan
    Choi, Yejin
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, 2023, : 5946 - 5960
  • [32] Multilingual Translation via Grafting Pre-trained Language Models
    Sun, Zewei
    Wang, Mingxuan
    Li, Lei
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 2735 - 2747
  • [33] Neural machine translation of clinical text: an empirical investigation into multilingual pre-trained language models and transfer-learning
    Han, Lifeng
    Gladkoff, Serge
    Erofeev, Gleb
    Sorokina, Irina
    Galiano, Betty
    Nenadic, Goran
    FRONTIERS IN DIGITAL HEALTH, 2024, 6
  • [34] PEIT: Bridging the Modality Gap with Pre-trained Models for End-to-End Image Translation
    Zhu, Shaolin
    Li, Shangjie
    Lei, Yikun
    Xiong, Deyi
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2023): LONG PAPERS, VOL 1, 2023, : 13433 - 13447
  • [35] Unsupervised Representation Learning from Pre-trained Diffusion Probabilistic Models
    Zhang, Zijian
    Zhao, Zhou
    Lin, Zhijie
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [36] Learning Sample Difficulty from Pre-trained Models for Reliable Prediction
    Cui, Peng
    Zhang, Dan
    Deng, Zhijie
    Dong, Yinpeng
    Zhu, Jun
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [37] PTMA: Pre-trained Model Adaptation for Transfer Learning
    Li, Xiao
    Yan, Junkai
    Jiang, Jianjian
    Zheng, Wei-Shi
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT I, KSEM 2024, 2024, 14884 : 176 - 188
  • [38] Classification of Rice Leaf Diseases using CNN-based pre-trained models and transfer learning
    Mavaddat, Marjan
    Naderan, Marjan
    Alavi, Seyyed Enayatallah
    2023 6TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION AND IMAGE ANALYSIS, IPRIA, 2023,
  • [39] SAR Image Despeckling Using Pre-trained Convolutional Neural Network Models
    Yang, Xiangli
    Denis, Loic
    Tupin, Florence
    Yang, Wen
    2019 JOINT URBAN REMOTE SENSING EVENT (JURSE), 2019,
  • [40] Transfer Learning from Pre-trained Language Models Improves End-to-End Speech Summarization
    Matsuura, Kohei
    Ashihara, Takanori
    Moriya, Takafumi
    Tanaka, Tomohiro
    Kano, Takatomo
    Ogawa, Atsunori
    Delcroix, Marc
    INTERSPEECH 2023, 2023, : 2943 - 2947