Feb. 8, 2024, 5:45 a.m. | Stanislas Strasman (SU, LPSM), Antonio Ocello (CMAP), Claire Boyer (LPSM), Sylvain Le Corff (LPSM), Vincent Lemaire (LPSM)

stat.ML updates on arXiv.org

Score-based generative models (SGMs) aim to estimate a target data distribution by learning score functions using only noise-perturbed samples from the target. Recent literature has focused extensively on assessing the error between the target and estimated distributions, gauging generative quality through the Kullback-Leibler (KL) divergence and Wasserstein distances. So far, all existing results have been obtained for noise schedules with time-homogeneous speed. Under mild assumptions on the data distribution, we establish an upper bound for the KL divergence …
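For context, the noise schedules in question can be written in the standard variance-preserving form (the notation here is illustrative and not taken from the paper): the forward noising dynamics are

dX_t = -\tfrac{1}{2}\,\beta(t)\,X_t\,dt + \sqrt{\beta(t)}\,dW_t,

where \beta(t) is the speed of the noise schedule; a time-homogeneous speed corresponds to constant \beta(t) \equiv \beta. The score network s_\theta is then typically fit by denoising score matching,

\min_\theta\ \mathbb{E}\big[\lambda(t)\,\|s_\theta(X_t,t) - \nabla_x \log p_t(X_t \mid X_0)\|^2\big],

with \lambda(t) a weighting function.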

