A Score-Based Deterministic Diffusion Algorithm with Smooth Scores for General Distributions

Authors

  • Karthik Elamvazhuthi University of California, Riverside
  • Xuechen Zhang University of California, Riverside
  • Matthew Jacobs University of California, Santa Barbara
  • Samet Oymak University of Michigan
  • Fabio Pasqualetti University of California, Riverside

DOI:

https://doi.org/10.1609/aaai.v38i11.29072

Keywords:

ML: Deep Generative Models & Autoencoders

Abstract

Score-matching-based diffusion has been shown to achieve state-of-the-art results in generative modeling. In the original score-matching-based diffusion algorithm, the forward equation is a differential equation whose probability density evolves according to a linear partial differential equation, the Fokker-Planck equation. A drawback of this approach is that the data distribution must have a Lipschitz logarithmic gradient, which excludes a large class of data distributions with compact support. We present a deterministic diffusion process whose vector fields are always Lipschitz, so that the score does not explode for probability measures with compact support. This deterministic diffusion process can be seen as a regularization of the porous media equation, which makes it possible to guarantee long-term convergence of the forward process to the noise distribution. Although the porous media equation itself is not always guaranteed to have a Lipschitz vector field, it can be used to bound the distance between the output of the algorithm and the data distribution as a function of the time horizon and the score-matching error. This analysis enables us to show that the algorithm has better dependence on the score-matching error than approaches based on stochastic diffusions. Using numerical experiments, we verify our theoretical results on example one- and two-dimensional data distributions that are compactly supported. Additionally, we validate the approach on a modified MNIST dataset whose distribution is concentrated on a compact set. In each experiment, the deterministic diffusion approach outperforms the diffusion algorithm with a stochastic forward process in terms of the FID scores of the generated samples.
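To make the score blow-up concrete, the following is a minimal illustrative sketch (not the authors' algorithm) of standard deterministic sampling via the probability-flow ODE for a variance-preserving (Ornstein-Uhlenbeck) forward process. The data distribution is two point masses at ±1, a compactly supported example: its noised score is available in closed form and is smooth for every t > 0, but its Lipschitz constant diverges as t → 0, which is exactly the failure mode the abstract describes. All function names and parameter values here are illustrative choices, not taken from the paper.

```python
import numpy as np

# Illustrative sketch only (not the paper's method): probability-flow ODE
# sampling for a VP/OU forward process with beta = 1. Data = two point
# masses at x0 = +-1 (compact support). The analytic score of the noised
# density is smooth for t > 0, but its Lipschitz constant grows like 1/t
# as t -> 0 -- the score blow-up for compactly supported data.

def score(x, t):
    """Analytic score of p_t = 0.5*N(m, s^2) + 0.5*N(-m, s^2)."""
    m = np.exp(-0.5 * t)       # mean contraction under the OU forward flow
    s2 = 1.0 - np.exp(-t)      # accumulated noise variance at time t
    return (-x + m * np.tanh(m * x / s2)) / s2

def sample(n=2000, T=5.0, t_min=1e-2, steps=500, seed=0):
    """Euler integration of the reverse ODE dx/dt = -x/2 - score(x, t)/2,
    run from t = T down to t = t_min (dt is negative)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)            # p_T is already close to N(0, 1)
    ts = np.linspace(T, t_min, steps + 1)
    for t0, t1 in zip(ts[:-1], ts[1:]):
        x = x + (t1 - t0) * (-0.5 * x - 0.5 * score(x, t0))
    return x
```

Running `sample()` produces points clustered near ±1. Note that the integration is stopped at a positive `t_min`: pushing `t_min` toward zero forces ever smaller Euler steps because the score's Lipschitz constant diverges there, which is the motivation for forward processes whose vector fields remain Lipschitz, as proposed in the paper.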

Published

2024-03-24

How to Cite

Elamvazhuthi, K., Zhang, X., Jacobs, M., Oymak, S., & Pasqualetti, F. (2024). A Score-Based Deterministic Diffusion Algorithm with Smooth Scores for General Distributions. Proceedings of the AAAI Conference on Artificial Intelligence, 38(11), 11866-11873. https://doi.org/10.1609/aaai.v38i11.29072

Section

AAAI Technical Track on Machine Learning II