Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions. (arXiv:2209.11215v1 [cs.LG])
cs.LG updates on arXiv.org
We provide theoretical convergence guarantees for score-based generative models (SGMs) such as denoising diffusion probabilistic models (DDPMs), which constitute the backbone of large-scale real-world generative models such as DALL$\cdot$E 2. Our main result is that, assuming accurate score estimates, such SGMs can efficiently sample from essentially any realistic data distribution. In contrast to prior works, our results (1) hold for an $L^2$-accurate score estimate (rather than $L^\infty$-accurate); (2) do not require restrictive functional inequality conditions that preclude substantial non-log-concavity; (3) …