Convergence for score-based generative modeling with polynomial complexity
Main Authors | |
---|---|
Format | Journal Article |
Language | English |
Published | 13.06.2022 |
Subjects | |
Summary: | Advances in Neural Information Processing Systems 35 (2022), 22870--22882. Score-based generative modeling (SGM) is a highly successful approach for learning a probability distribution from data and generating further samples. We prove the first polynomial convergence guarantees for the core mechanic behind SGM: drawing samples from a probability density $p$ given a score estimate (an estimate of $\nabla \ln p$) that is accurate in $L^2(p)$. Compared to previous works, we do not incur error that grows exponentially in time or that suffers from a curse of dimensionality. Our guarantee works for any smooth distribution and depends polynomially on its log-Sobolev constant. Using our guarantee, we give a theoretical analysis of score-based generative modeling, which transforms white-noise input into samples from a learned data distribution given score estimates at different noise scales. Our analysis gives theoretical grounding to the observation that an annealed procedure is required in practice to generate good samples, as our proof depends essentially on using annealing to obtain a warm start at each step. Moreover, we show that a predictor-corrector algorithm gives better convergence than using either portion alone. |
DOI: 10.48550/arxiv.2206.06227
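The summary describes the sampling mechanic the paper analyzes: an annealed procedure that moves from white noise to the learned distribution through a decreasing sequence of noise scales, combining a predictor step with a Langevin-type corrector step at each scale, driven only by score estimates. The sketch below illustrates this general predictor-corrector scheme; it is not taken from the paper, and the score function `score_fn`, the noise schedule `sigmas`, and the step-size rule are illustrative assumptions.

```python
import numpy as np

def pc_sampler(score_fn, sigmas, dim, n_corrector=1, snr=0.16, rng=None):
    """Annealed predictor-corrector sampling sketch (illustrative, not the paper's code).

    score_fn(x, sigma) is assumed to approximate grad log p_sigma(x), the score of
    the data distribution smoothed with Gaussian noise of scale sigma.
    `sigmas` is a decreasing noise schedule, e.g. np.geomspace(50.0, 0.01, 100).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Start from (scaled) white noise at the largest noise scale.
    x = sigmas[0] * rng.standard_normal(dim)
    for i, sigma in enumerate(sigmas):
        # Corrector: a few Langevin steps targeting the smoothed density p_sigma,
        # warm-started from the previous, slightly noisier scale.
        for _ in range(n_corrector):
            g = score_fn(x, sigma)
            eps = 2.0 * (snr * sigma) ** 2                     # illustrative step size
            x = x + eps * g + np.sqrt(2.0 * eps) * rng.standard_normal(dim)
        # Predictor: one Euler step of the reverse-time diffusion, moving from
        # noise scale sigma down to the next, smaller scale.
        if i + 1 < len(sigmas):
            dt = sigma ** 2 - sigmas[i + 1] ** 2               # decrease in noise variance
            x = x + dt * score_fn(x, sigma) + np.sqrt(dt) * rng.standard_normal(dim)
    return x
```

In this sketch the annealing supplies a warm start for each corrector run, which is the mechanism the abstract says the proof relies on; the predictor and corrector portions can each be run alone, and the paper's result is that combining them gives better convergence than either one by itself.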