Complexity of Gibbs samplers through Bayesian asymptotics. (arXiv:2304.06993v1 [stat.CO])
stat.ML updates on arXiv.org
Gibbs samplers are popular algorithms for approximating posterior distributions
arising from Bayesian hierarchical models. Despite their popularity and good
empirical performance, there are still relatively few quantitative theoretical
results on their scalability or lack thereof, far fewer than for gradient-based
sampling methods. We introduce a novel technique for analysing the asymptotic
behaviour of the mixing times of Gibbs samplers, based on tools from Bayesian
asymptotics. We apply our methodology to high-dimensional hierarchical models,
obtaining dimension-free convergence results for …
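To make the setting concrete, here is a minimal sketch of a Gibbs sampler for a toy two-level Gaussian hierarchical model. This is an illustrative example only, not the paper's construction or analysis: the model (y_ij ~ N(theta_i, sigma2), theta_i ~ N(mu, tau2), flat prior on mu, with sigma2 and tau2 assumed known) and all variable names are assumptions chosen for simplicity. The sampler alternates between the conjugate full conditionals of the group means theta and the global mean mu.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate data from the toy hierarchical model: I groups, n observations each.
I, n = 20, 30
sigma2, tau2 = 1.0, 1.0          # assumed-known variances (illustrative choice)
true_mu = 2.0
true_theta = rng.normal(true_mu, np.sqrt(tau2), size=I)
y = rng.normal(true_theta[:, None], np.sqrt(sigma2), size=(I, n))

def gibbs(y, sigma2, tau2, iters=2000, burn=500):
    """Gibbs sampler alternating theta | mu, y and mu | theta."""
    I, n = y.shape
    ybar = y.mean(axis=1)
    mu = 0.0
    mu_draws = []
    for t in range(iters):
        # theta_i | mu, y: conjugate Gaussian update (precision-weighted mean).
        prec = n / sigma2 + 1.0 / tau2
        mean = (n * ybar / sigma2 + mu / tau2) / prec
        theta = rng.normal(mean, np.sqrt(1.0 / prec))
        # mu | theta: with a flat prior, the full conditional is N(mean(theta), tau2 / I).
        mu = rng.normal(theta.mean(), np.sqrt(tau2 / I))
        if t >= burn:
            mu_draws.append(mu)
    return np.array(mu_draws)

mus = gibbs(y, sigma2, tau2)
print(f"posterior mean of mu ~ {mus.mean():.2f}")
```

The question the paper addresses is how the mixing time of this kind of alternating scheme scales as the number of groups and observations grows, which the toy code above does not measure.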