Oct. 5, 2022, 1:13 a.m. | Sitan Chen, Sinho Chewi, Jerry Li, Yuanzhi Li, Adil Salim, Anru R. Zhang

cs.LG updates on arXiv.org

We provide theoretical convergence guarantees for score-based generative
models (SGMs) such as denoising diffusion probabilistic models (DDPMs), which
constitute the backbone of large-scale real-world generative models such as
DALL$\cdot$E 2. Our main result is that, assuming accurate score estimates,
such SGMs can efficiently sample from essentially any realistic data
distribution. In contrast to prior works, our results (1) hold for an
$L^2$-accurate score estimate (rather than $L^\infty$-accurate); (2) do not
require restrictive functional inequality conditions that preclude substantial
non-log-concavity; (3) …
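To make the DDPM sampler the abstract refers to concrete, here is a minimal, self-contained sketch of ancestral sampling with the variance-preserving discretization. All names (`make_schedule`, `ddpm_sample`, the beta range) are illustrative, not from the paper; as a sanity check the target is a standard Gaussian, whose noised marginals stay N(0, I) under the forward process, so the exact score is simply -x.

```python
import numpy as np

def make_schedule(T=200, beta_min=1e-4, beta_max=0.02):
    """Linear noise schedule (illustrative values, not from the paper)."""
    return np.linspace(beta_min, beta_max, T)

def ddpm_sample(score_fn, betas, dim, rng):
    """Ancestral sampling: run the reverse chain from pure noise.

    score_fn(x, alpha_bar) should return the score of the noised
    marginal at the corresponding time step.
    """
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    x = rng.standard_normal(dim)  # initialize at the N(0, I) prior
    for t in range(len(betas) - 1, -1, -1):
        beta, alpha = betas[t], alphas[t]
        # Reverse-step mean written in score form:
        # (x + beta * score) / sqrt(alpha)
        mean = (x + beta * score_fn(x, alpha_bars[t])) / np.sqrt(alpha)
        noise = rng.standard_normal(dim) if t > 0 else 0.0
        x = mean + np.sqrt(beta) * noise
    return x

# For N(0, I) data the variance-preserving forward process keeps every
# marginal at N(0, I), so the exact score is -x at all times.
gaussian_score = lambda x, alpha_bar: -x

rng = np.random.default_rng(0)
samples = np.array([ddpm_sample(gaussian_score, make_schedule(), 2, rng)
                    for _ in range(2000)])
# Samples should be approximately standard normal (mean ~ 0, std ~ 1).
print(samples.mean(axis=0), samples.std(axis=0))
```

With the exact score plugged in, each reverse transition here is exact for the Gaussian target, so the empirical mean and standard deviation should match N(0, I) up to Monte Carlo error. The paper's result concerns the realistic setting where the score is only L^2-accurate rather than exact.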
