Scalability of Metropolis-within-Gibbs schemes for high-dimensional Bayesian models
March 15, 2024, 4:44 a.m. | Filippo Ascolani, Gareth O. Roberts, Giacomo Zanella
stat.ML updates on arXiv.org
Abstract: We study general coordinate-wise MCMC schemes (such as Metropolis-within-Gibbs samplers), which are commonly used to fit Bayesian non-conjugate hierarchical models. We relate their convergence properties to those of the corresponding (potentially not implementable) Gibbs sampler through the notion of conditional conductance. This allows us to study the performance of popular Metropolis-within-Gibbs schemes for non-conjugate hierarchical models in high-dimensional regimes, where both the number of datapoints and the number of parameters increase. Under random data-generating assumptions, we establish dimension-free …
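To illustrate the class of samplers the paper studies, here is a minimal sketch of a Metropolis-within-Gibbs scheme: each coordinate is updated in turn with a random-walk Metropolis step targeting its full conditional. The bivariate Gaussian target, the step size, and all function names are illustrative assumptions, not the paper's hierarchical-model setup.

```python
import numpy as np

def log_target(x, rho=0.8):
    # Unnormalised log-density of a bivariate Gaussian with unit
    # variances and correlation rho (an assumed toy target).
    return -(x[0]**2 - 2 * rho * x[0] * x[1] + x[1]**2) / (2 * (1 - rho**2))

def metropolis_within_gibbs(n_iter=5000, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(2)
    samples = np.empty((n_iter, 2))
    for t in range(n_iter):
        # Sweep over coordinates: each gets a random-walk Metropolis
        # update whose acceptance ratio uses the joint log-density,
        # which is equivalent to targeting the full conditional.
        for i in range(2):
            prop = x.copy()
            prop[i] += step * rng.standard_normal()
            if np.log(rng.uniform()) < log_target(prop) - log_target(x):
                x = prop
        samples[t] = x
    return samples

samples = metropolis_within_gibbs()
print(samples.mean(axis=0))  # both coordinate means should be near 0
```

When the full conditionals can be sampled exactly, the Metropolis step is replaced by a direct draw and the scheme reduces to the (possibly not implementable) Gibbs sampler that the paper uses as a reference point.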