Oct. 20, 2022, 1:13 a.m. | Henry Li, Yuval Kluger

stat.ML updates on arXiv.org

We introduce a simple modification to the standard maximum likelihood
estimation (MLE) framework. Rather than maximizing a single unconditional
likelihood of the data under the model, we maximize a family of noise
conditional likelihoods consisting of the data perturbed by a continuum of
noise levels. We find that models trained this way are more robust to noise,
obtain higher test likelihoods, and generate higher quality images. They can
also be sampled from via a novel score-based sampling scheme which combats …
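
The objective described above is easy to sketch: draw a noise level from a continuum, perturb each example at that level, and maximize the log-likelihood of the perturbed data conditioned on that level. Below is a minimal, hypothetical sketch in PyTorch using a toy diagonal-Gaussian density model conditioned on the noise level; the class and function names, the log-uniform noise schedule, and the toy data are illustrative assumptions, not the authors' implementation.

import math
import torch
import torch.nn as nn

class NoiseConditionalGaussian(nn.Module):
    """Toy density model: a diagonal Gaussian over x whose parameters depend on sigma."""
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * dim),   # per-dimension mean and log-std
        )

    def log_prob(self, x, sigma):
        # sigma: (batch,) noise levels; x: (batch, dim) perturbed data
        mean, log_std = self.net(sigma.unsqueeze(-1)).chunk(2, dim=-1)
        var = (2 * log_std).exp()
        return (-0.5 * ((x - mean) ** 2 / var) - log_std
                - 0.5 * math.log(2 * math.pi)).sum(-1)

def noise_conditional_mle_step(model, optimizer, x, sigma_min=0.01, sigma_max=1.0):
    # Draw one noise level per example from a continuum (log-uniform is an assumption).
    u = torch.rand(x.shape[0])
    sigma = sigma_min * (sigma_max / sigma_min) ** u
    x_noisy = x + sigma.unsqueeze(-1) * torch.randn_like(x)
    # Maximize the noise-conditional log-likelihood, i.e. minimize its negative.
    loss = -model.log_prob(x_noisy, sigma).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    model = NoiseConditionalGaussian(dim=2)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    data = torch.randn(512, 2) * 0.5 + 1.0   # toy dataset standing in for images
    for _ in range(200):
        loss = noise_conditional_mle_step(model, opt, data)
    print(f"final noise-conditional NLL: {loss:.3f}")

In this sketch the model is trained on a family of objectives indexed by sigma rather than a single unconditional likelihood; any likelihood-based model could stand in for the toy Gaussian.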

arxiv likelihood maximum likelihood estimation modeling noise
