Boosting Noise Reduction Effect via Unsupervised Fine-Tuning Strategy

Cited by: 1
Authors
Jiang, Xinyi [1 ]
Xu, Shaoping [1 ]
Wu, Junyun [1 ]
Zhou, Changfei [1 ]
Ji, Shuichen [1 ]
Affiliations
[1] Nanchang Univ, Sch Math & Comp Sci, Nanchang 330031, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2024, Vol. 14, Issue 05
Keywords
boosting denoising effect; supervised denoising models; data bias; unsupervised denoising models; flexibility; fine-tuning; IMAGE; SPARSE
DOI
10.3390/app14051742
CLC Number
O6 [Chemistry]
Discipline Code
0703
Abstract
Over the last decade, supervised denoising models trained on extensive datasets have exhibited remarkable performance in image denoising. However, these models offer limited flexibility and degrade to varying degrees in practical scenarios, particularly when the noise distribution of a given noisy image deviates from that of the training images. To tackle this problem, we propose a two-stage denoising model that appends an unsupervised fine-tuning phase after a supervised denoising model has processed the input noisy image and produced a denoised image (regarded as a preprocessed image). More specifically, in the first stage we replace the convolution blocks of the U-shaped network framework (used in the deep image prior method) with Transformer modules; the resulting model is referred to as a U-Transformer. The U-Transformer is trained on noisy images and their labels to preprocess the input noisy images. In the second stage, we condense the supervised U-Transformer into a simplified version containing only one Transformer module with fewer parameters, and switch its training mode to unsupervised training, following an approach similar to that of the deep image prior method. This stage further eliminates the minor residual noise and artifacts present in the preprocessed image, yielding clearer and more realistic output images. Experimental results show that the proposed method achieves significant noise reduction on both synthetic and real images, surpassing state-of-the-art methods. This superiority stems from the supervised model's ability to rapidly process given noisy images, while the unsupervised model leverages its flexibility to produce a fine-tuned network with enhanced noise reduction capability. Moreover, because the supervised model supplies higher-quality preprocessed images, the unsupervised fine-tuning model requires fewer parameters, trains and converges rapidly, and thus achieves high overall execution efficiency.
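A minimal sketch of the second (unsupervised fine-tuning) stage described in the abstract, assuming PyTorch. The SmallTransformerRefiner module, its hyper-parameters (patch size, embedding dimension, head count), the 0.8/0.2 loss weighting, and the iteration count are illustrative assumptions, not the authors' released implementation; only the overall scheme (a compact single-Transformer network optimized per image in deep-image-prior style, with no clean label) follows the abstract.

    # Sketch of deep-image-prior-style unsupervised fine-tuning (assumptions in comments).
    import torch
    import torch.nn as nn

    class SmallTransformerRefiner(nn.Module):
        """Hypothetical condensed refiner: a single Transformer module with
        few parameters, as the abstract describes for stage 2."""
        def __init__(self, channels=3, dim=48, patch=8, heads=4):
            super().__init__()
            # Patch embedding; assumes H and W are divisible by `patch`.
            self.embed = nn.Conv2d(channels, dim, kernel_size=patch, stride=patch)
            self.block = nn.TransformerEncoderLayer(
                d_model=dim, nhead=heads, dim_feedforward=2 * dim, batch_first=True)
            self.unembed = nn.ConvTranspose2d(dim, channels, kernel_size=patch, stride=patch)

        def forward(self, x):
            tokens = self.embed(x)                    # (B, dim, H/p, W/p)
            b, c, h, w = tokens.shape
            seq = tokens.flatten(2).transpose(1, 2)   # (B, HW/p^2, dim)
            seq = self.block(seq)
            tokens = seq.transpose(1, 2).reshape(b, c, h, w)
            return x + self.unembed(tokens)           # residual refinement

    def unsupervised_finetune(preprocessed, noisy, iters=300, lr=1e-3):
        """Per-image optimization with no clean label: `preprocessed` is the
        stage-1 (supervised U-Transformer) output, `noisy` the original input.
        The mixed reconstruction loss and its weights are assumptions."""
        net = SmallTransformerRefiner(channels=preprocessed.shape[1])
        opt = torch.optim.Adam(net.parameters(), lr=lr)
        for _ in range(iters):       # early stopping acts as the regularizer, as in DIP
            opt.zero_grad()
            out = net(preprocessed)
            # Stay close to the cleaner stage-1 estimate while remaining
            # faithful to the actual noisy observation.
            loss = 0.8 * nn.functional.mse_loss(out, preprocessed) \
                 + 0.2 * nn.functional.mse_loss(out, noisy)
            loss.backward()
            opt.step()
        return net(preprocessed).detach()

For a noisy input y of shape (1, 3, H, W) with H and W divisible by the patch size, stage 1 would produce pre = u_transformer(y), and unsupervised_finetune(pre, y) returns the refined image; because the network is small and starts from a high-quality preprocessed image, few iterations are needed, which is consistent with the efficiency claim in the abstract.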
Pages: 19