A Score-Based Deterministic Diffusion Algorithm with Smooth Scores for General Distributions

Cited by: 0
Authors
Elamvazhuthi, Karthik [1 ]
Zhang, Xuechen
Jacobs, Matthew [2 ]
Oymak, Samet [3 ]
Pasqualetti, Fabio [1 ]
Affiliations
[1] Univ Calif Riverside, Riverside, CA 92521 USA
[2] Univ Calif Santa Barbara, Santa Barbara, CA USA
[3] Univ Michigan, Ann Arbor, MI USA
Source
THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 11, 2024
Keywords: (none listed)
DOI: (not available)
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Score-matching-based diffusion has been shown to achieve state-of-the-art results in generative modeling. In the original score-matching-based diffusion algorithm, the forward process is a stochastic differential equation whose probability density evolves according to a linear partial differential equation, the Fokker-Planck equation. A drawback of this approach is that the data distribution must have a Lipschitz logarithmic gradient, which excludes a large class of data distributions, including those with compact support. We present a deterministic diffusion process whose vector fields are always Lipschitz, so that the score does not blow up for probability measures with compact support. This deterministic diffusion process can be seen as a regularization of the porous media equation, which makes it possible to guarantee long-term convergence of the forward process to the noise distribution. Although the porous media equation itself is not always guaranteed to have a Lipschitz vector field, it can be used to bound the closeness of the algorithm's output to the data distribution as a function of the time horizon and the score-matching error. This analysis shows that the algorithm has a better dependence on the score-matching error than approaches based on stochastic diffusions. Using numerical experiments, we verify our theoretical results on example one- and two-dimensional compactly supported data distributions. Additionally, we validate the approach on modified versions of the MNIST and CIFAR-10 data sets, for which the distribution is concentrated on a compact set. In each experiment, the approach using a deterministic forward diffusion outperforms the diffusion algorithm with a stochastic forward process, as measured by the FID scores of the generated samples.
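The abstract contrasts a stochastic forward SDE with a deterministic, ODE-based forward process. As a rough illustration of that general mechanism only, not the paper's algorithm, the sketch below runs a probability-flow-style deterministic forward and reverse pass for a 1-D Gaussian, where the score is available in closed form; the Ornstein-Uhlenbeck drift and all function names are assumptions made for this example.

```python
import numpy as np

def var_t(t, var0):
    """Marginal variance of an OU forward process started from N(0, var0)."""
    return var0 * np.exp(-2 * t) + 1.0 - np.exp(-2 * t)

def score(x, t, var0):
    """Closed-form score (grad log density) of the zero-mean Gaussian marginal."""
    return -x / var_t(t, var0)

def flow_field(x, t, var0):
    """Probability-flow ODE field: dx/dt = -x - score(x, t); no noise term."""
    return -x - score(x, t, var0)

def integrate(x, t0, t1, var0, n_steps=1000):
    """Euler integration of the deterministic flow from t0 to t1 (either direction)."""
    ts = np.linspace(t0, t1, n_steps + 1)
    dt = ts[1] - ts[0]
    for t in ts[:-1]:
        x = x + dt * flow_field(x, t, var0)
    return x

rng = np.random.default_rng(0)
var0 = 4.0
x0 = rng.normal(0.0, np.sqrt(var0), size=20000)  # toy "data" samples
xT = integrate(x0, 0.0, 3.0, var0)               # forward: push toward N(0, 1)
x_back = integrate(xT, 3.0, 0.0, var0)           # deterministic reverse pass
print(np.var(xT))      # ~1 (close to the noise variance)
print(np.var(x_back))  # ~4 (data variance is recovered)
```

Because the forward map is a deterministic flow rather than a stochastic process, the same ODE integrated backwards in time recovers the data samples, which is the property the generative (reverse) pass relies on.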
Pages: 11866-11873 (8 pages)