Efficient Sampling on Riemannian Manifolds via Langevin MCMC
Feb. 19, 2024, 5:42 a.m. | Xiang Cheng, Jingzhao Zhang, Suvrit Sra
cs.LG updates on arXiv.org
Abstract: We study the task of efficiently sampling from a Gibbs distribution $d\pi^* = e^{-h}\, d\mathrm{vol}_g$ over a Riemannian manifold $M$ via (geometric) Langevin MCMC; this algorithm involves computing exponential maps in random Gaussian directions and is efficiently implementable in practice. The key to our analysis of Langevin MCMC is a bound on the discretization error of the geometric Euler-Maruyama scheme, assuming $\nabla h$ is Lipschitz and $M$ has bounded sectional curvature. Our error …
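The abstract describes the algorithm as taking exponential-map steps in random Gaussian directions. As a concrete illustration (not the paper's implementation), here is a minimal sketch of geometric Langevin MCMC on the unit sphere $S^2 \subset \mathbb{R}^3$, where the exponential map has a closed form: each step combines a Riemannian gradient step on $h$ with tangent-space Gaussian noise, then follows the geodesic. The function names and the choice of target $h(x) = -\kappa \langle x, \mu\rangle$ (a von Mises-Fisher-type density) are illustrative assumptions.

```python
import numpy as np

def project_tangent(x, v):
    # Project an ambient vector v onto the tangent space at x on the unit sphere.
    return v - np.dot(v, x) * x

def exp_map(x, v):
    # Exponential map on the unit sphere: follow the geodesic from x in direction v.
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return x
    return np.cos(norm) * x + np.sin(norm) * (v / norm)

def langevin_mcmc_sphere(grad_h, x0, step, n_steps, rng):
    """Geometric Euler-Maruyama discretization of Langevin dynamics on S^2.

    Targets (approximately) the Gibbs distribution proportional to e^{-h}.
    """
    x = x0 / np.linalg.norm(x0)
    samples = []
    for _ in range(n_steps):
        g = project_tangent(x, grad_h(x))             # Riemannian gradient of h at x
        noise = project_tangent(x, rng.standard_normal(3))  # Gaussian in tangent space
        v = -step * g + np.sqrt(2.0 * step) * noise   # Langevin update direction
        x = exp_map(x, v)                             # step along the geodesic
        samples.append(x)
    return np.array(samples)

# Illustrative target: h(x) = -kappa * <x, mu>, so e^{-h} concentrates near mu.
rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0, 1.0])
samples = langevin_mcmc_sphere(lambda x: -5.0 * mu,
                               np.array([1.0, 0.0, 0.0]),
                               step=0.01, n_steps=5000, rng=rng)
```

Every iterate stays exactly on the manifold because the update is applied through the exponential map rather than by stepping in the ambient space and re-projecting; the discretization error of this scheme is what the paper's analysis bounds.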