Locking and Quacking: Stacking Bayesian model predictions by log-pooling and superposition. (arXiv:2305.07334v1 [stat.ML])
stat.ML updates on arXiv.org
Combining predictions from different models is a central problem in Bayesian
inference and machine learning more broadly. Currently, these predictive
distributions are almost exclusively combined using linear mixtures such as
Bayesian model averaging, Bayesian stacking, and mixture of experts. Such
linear mixtures impose idiosyncrasies, such as multi-modality, that might be
undesirable for some applications. While alternative strategies exist (e.g. the
geometric bridge or superposition), optimising their parameters usually
involves computing an intractable normalising constant repeatedly. We present
two novel Bayesian …
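The distinction the abstract draws can be illustrated numerically. In the sketch below (an illustration assuming two Gaussian predictive densities, not the paper's method), a linear mixture of two well-separated component densities stays bimodal, while the log-pool (geometric mixture) is unimodal; the log-pool also requires renormalisation, which is done here on a grid because the normalising constant is intractable in general.

```python
import numpy as np

def gauss(x, mu, sigma):
    """Gaussian density evaluated pointwise on a grid."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Two hypothetical predictive distributions, well separated.
x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]
p1 = gauss(x, -2.0, 1.0)
p2 = gauss(x, 2.0, 1.0)
w = 0.5  # pooling weight

# Linear pool (mixture): already normalised, can be multimodal.
linear_pool = w * p1 + (1 - w) * p2

# Log pool (geometric mixture): p1^w * p2^(1-w) is unnormalised;
# here we renormalise numerically on the grid.
unnorm = p1 ** w * p2 ** (1 - w)
log_pool = unnorm / (unnorm.sum() * dx)

def n_modes(density):
    """Count strict interior local maxima on the grid."""
    return int(np.sum((density[1:-1] > density[:-2]) &
                      (density[1:-1] > density[2:])))

print(n_modes(linear_pool))  # bimodal mixture
print(n_modes(log_pool))     # unimodal geometric pool
```

For Gaussian components the log-pool happens to have a closed form (another Gaussian), but for general predictive densities the normalising integral must be approximated, which is the computational obstacle the abstract refers to.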