Boosting Noise Reduction Effect via Unsupervised Fine-Tuning Strategy

Cited by: 1
Authors
Jiang, Xinyi [1 ]
Xu, Shaoping [1 ]
Wu, Junyun [1 ]
Zhou, Changfei [1 ]
Ji, Shuichen [1 ]
Affiliations
[1] Nanchang Univ, Sch Math & Comp Sci, Nanchang 330031, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2024, Vol. 14, Issue 05
Keywords
boosting denoising effect; supervised denoising models; data bias; unsupervised denoising models; flexibility; fine-tuning; IMAGE; SPARSE;
DOI
10.3390/app14051742
Chinese Library Classification (CLC)
O6 [Chemistry];
Discipline Code
0703;
Abstract
Over the last decade, supervised denoising models trained on extensive datasets have achieved remarkable image-denoising performance. However, these models offer limited flexibility, and their noise reduction capability degrades to varying degrees in practical scenarios, particularly when the noise distribution of a given noisy image deviates from that of the training images. To address this problem, we propose a two-stage denoising model that appends an unsupervised fine-tuning phase after a supervised denoising model has processed the input noisy image and produced a denoised result (regarded as a preprocessed image). Specifically, in the first stage we replace the convolution blocks of the U-shaped network framework (as used in the deep image prior method) with Transformer modules; we refer to the resulting model as a U-Transformer. The U-Transformer is trained on pairs of noisy images and their clean labels to preprocess the input noisy images. In the second stage, we condense the supervised U-Transformer into a simplified version containing only one Transformer module with fewer parameters, and we switch its training mode to unsupervised training, following an approach similar to that of the deep image prior method. This stage further removes the minor residual noise and artifacts present in the preprocessed image, yielding clearer and more realistic output images. Experimental results show that the proposed method achieves significant noise reduction on both synthetic and real images, surpassing state-of-the-art methods. This superiority stems from the supervised model's ability to process given noisy images rapidly, while the unsupervised model leverages its flexibility to produce a fine-tuned network with enhanced noise reduction capability. Moreover, because the supervised model supplies higher-quality preprocessed images, the proposed unsupervised fine-tuning model requires fewer parameters, trains and converges quickly, and thus achieves high overall execution efficiency.
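The abstract does not give implementation details, so the following is a minimal PyTorch sketch of the second-stage, deep-image-prior-style unsupervised fine-tuning it describes: a small single-Transformer-block network is fitted to the original noisy image, starting from the supervised model's preprocessed output, with an early-stopped optimization so the network removes residual noise rather than re-fitting it. All names (LightTransformerDenoiser, fine_tune), the patch-token design, the loss, the initialization, and the step budget are illustrative assumptions, not the authors' code.

import torch
import torch.nn as nn

class LightTransformerDenoiser(nn.Module):
    # Hypothetical lightweight second-stage model: one Transformer
    # encoder layer over non-overlapping image patches, with a
    # residual connection so the network predicts a correction.
    def __init__(self, channels=3, patch=8, dim=192, heads=4):
        super().__init__()
        self.embed = nn.Conv2d(channels, dim, kernel_size=patch, stride=patch)
        self.block = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, dim_feedforward=2 * dim,
            batch_first=True, norm_first=True)
        self.unembed = nn.ConvTranspose2d(dim, channels, kernel_size=patch, stride=patch)

    def forward(self, x):
        tokens = self.embed(x)                    # (B, dim, H/p, W/p)
        b, d, h, w = tokens.shape
        seq = tokens.flatten(2).transpose(1, 2)   # (B, HW/p^2, dim)
        seq = self.block(seq)
        tokens = seq.transpose(1, 2).reshape(b, d, h, w)
        return x + self.unembed(tokens)           # residual refinement

def fine_tune(noisy, preprocessed, steps=300, lr=1e-4):
    # Deep-image-prior-style recipe: minimize a data-fidelity loss
    # against the noisy input and stop early, exploiting the network's
    # bias toward fitting image structure before fitting noise.
    # Initialization from the supervised model's weights (if any) is
    # omitted here; the paper's exact procedure is not specified.
    model = LightTransformerDenoiser()
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        out = model(preprocessed)       # refine the preprocessed image
        loss = loss_fn(out, noisy)      # fidelity to the original noisy input
        loss.backward()
        opt.step()
    with torch.no_grad():
        return model(preprocessed)

Usage: with noisy and preprocessed as (1, 3, H, W) tensors whose H and W are divisible by the patch size, fine_tune(noisy, preprocessed) returns the refined image. Because the model is small and the preprocessed input is already close to clean, a few hundred steps suffice in this sketch, which is consistent with the paper's claim of rapid training and convergence.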
Pages: 19