April 17, 2023, 8:05 p.m. | Filippo Ascolani, Giacomo Zanella

stat.ML updates on arXiv.org

Gibbs samplers are popular algorithms for approximating posterior distributions
arising from Bayesian hierarchical models. Despite their popularity and good
empirical performance, there are still relatively few quantitative theoretical
results on their scalability or lack thereof, e.g. far fewer than for
gradient-based sampling methods. We introduce a novel technique to analyse the
asymptotic behaviour of mixing times of Gibbs samplers, based on tools from
Bayesian asymptotics. We apply our methodology to high-dimensional hierarchical
models, obtaining dimension-free convergence results for …
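To make the class of algorithms being analysed concrete, here is a minimal Gibbs sampler sketch for a toy two-level Gaussian hierarchical model. The model, the known variances, and all variable names are illustrative assumptions, not the paper's setting or results; the sketch only shows the alternating draws from full conditionals that define a Gibbs sampler.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy model (for illustration only):
#   y_i | theta_i ~ N(theta_i, sigma2)
#   theta_i | mu  ~ N(mu, tau2)
#   mu            ~ N(0, s2)
# with sigma2, tau2, s2 known.
sigma2, tau2, s2 = 1.0, 1.0, 100.0
y = rng.normal(loc=2.0, scale=1.0, size=50)  # synthetic data
n = y.size

def gibbs(n_iter=5000):
    mu = 0.0
    mus = np.empty(n_iter)
    for t in range(n_iter):
        # Full conditional of theta_i | mu, y_i: conjugate normal
        # with precision-weighted mean (vectorised over i).
        prec = 1.0 / sigma2 + 1.0 / tau2
        mean = (y / sigma2 + mu / tau2) / prec
        theta = rng.normal(mean, np.sqrt(1.0 / prec))
        # Full conditional of mu | theta: conjugate normal.
        prec_mu = n / tau2 + 1.0 / s2
        mean_mu = (theta.sum() / tau2) / prec_mu
        mu = rng.normal(mean_mu, np.sqrt(1.0 / prec_mu))
        mus[t] = mu
    return mus

mus = gibbs()
print("posterior mean of mu ≈", mus[1000:].mean())  # discard burn-in
```

The scalability question the abstract raises is, roughly, how the number of such sweeps needed for the chain to mix grows as the number of groups and parameters increases.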
