Nearly $d$-Linear Convergence Bounds for Diffusion Models via Stochastic Localization
March 7, 2024, 5:43 a.m. | Joe Benton, Valentin De Bortoli, Arnaud Doucet, George Deligiannidis
cs.LG updates on arXiv.org (arxiv.org)
Abstract: Denoising diffusions are a powerful method to generate approximate samples from high-dimensional data distributions. Recent results provide polynomial bounds on their convergence rate, assuming $L^2$-accurate scores. Until now, the tightest bounds were either superlinear in the data dimension or required strong smoothness assumptions. We provide the first convergence bounds which are linear in the data dimension (up to logarithmic factors) assuming only finite second moments of the data distribution. We show that diffusion models require …
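To make the object of the analysis concrete, here is a minimal illustrative sketch of the kind of sampler these convergence bounds concern: an Euler–Maruyama discretization of the reverse-time SDE for an Ornstein–Uhlenbeck forward process, a standard diffusion-model setup. The `score_fn` argument stands in for a learned score network approximating $\nabla \log p_t$; the step count, horizon, and placeholder score below are arbitrary choices for illustration, not details from the paper.

```python
import numpy as np

def reverse_sde_sample(score_fn, d, n_steps=1000, T=10.0, rng=None):
    """Sample via the reverse-time SDE of an OU forward process.

    Forward process (illustrative assumption): dX_t = -X_t dt + sqrt(2) dB_t,
    whose stationary law is N(0, I). The reverse-time SDE then has drift
    x + 2 * score(x, t), where score approximates grad log p_t(x).
    """
    rng = np.random.default_rng() if rng is None else rng
    dt = T / n_steps
    # Initialize from the forward process's stationary distribution N(0, I).
    x = rng.standard_normal(d)
    for k in range(n_steps):
        t = T - k * dt  # integrate backwards from t = T down to t = 0
        drift = x + 2.0 * score_fn(x, t)
        x = x + drift * dt + np.sqrt(2.0 * dt) * rng.standard_normal(d)
    return x

# Toy check: if the data distribution is N(0, I), the true score is -x,
# so the sampler should return approximately standard Gaussian samples.
if __name__ == "__main__":
    sample = reverse_sde_sample(lambda x, t: -x, d=16, n_steps=500, T=5.0)
    print(sample[:4])
```

The dimension dependence the paper bounds enters through discretizations like this one: the question is how many steps `n_steps` must grow with the data dimension $d$ to keep the sampling error controlled, given only an $L^2$-accurate `score_fn`.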