March 28, 2024, 4:42 a.m. | Jannis Chemseddine, Paul Hagemann, Christian Wald, Gabriele Steidl

cs.LG updates on arXiv.org

arXiv:2403.18705v1 Announce Type: new
Abstract: In inverse problems, many conditional generative models approximate the posterior measure by minimizing a distance between the joint measure and its learned approximation. While this approach also controls the distance between the posterior measures in the case of the Kullback--Leibler divergence, this does not in general hold true for the Wasserstein distance. In this paper, we introduce a conditional Wasserstein distance via a set of restricted couplings that equals the expected Wasserstein distance of the …
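To make the construction concrete, here is a minimal LaTeX sketch of the restricted-coupling definition the abstract describes; the notation ($W_{p,Y}$, $\Gamma_Y$, the joint measures $P_{Y,X}$ and $Q_{Y,X}$) is assumed for illustration and need not match the paper's own symbols.

% Hedged sketch: a conditional Wasserstein distance over couplings that
% are forced to match the condition variable Y exactly. Notation is
% assumed, not taken from the paper.
\[
  W_{p,Y}^p(P_{Y,X}, Q_{Y,X})
  := \inf_{\pi \in \Gamma_Y(P_{Y,X},\, Q_{Y,X})}
     \int \|x_1 - x_2\|^p \,\mathrm{d}\pi\bigl((y_1, x_1), (y_2, x_2)\bigr),
\]
% where \Gamma_Y contains only couplings supported on the diagonal
% \{y_1 = y_2\}, i.e. couplings that match the Y-components exactly.
% Under this restriction the joint distance disintegrates into an
% expected distance between the posteriors:
\[
  W_{p,Y}^p(P_{Y,X}, Q_{Y,X})
  = \mathbb{E}_{y \sim P_Y}\bigl[\, W_p^p(P_{X \mid Y = y},\, Q_{X \mid Y = y}) \,\bigr].
\]

The restriction is what makes the decomposition work: an unrestricted optimal coupling may pair points with different conditions $y_1 \neq y_2$, in which case closeness of the joints says little about closeness of the individual posteriors.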

