May 15, 2023, 12:42 a.m. | Yuling Yao, Luiz Max Carvalho, Diego Mesquita, Yann McLatchie

stat.ML updates on arXiv.org

Combining predictions from different models is a central problem in Bayesian
inference and machine learning more broadly. Currently, these predictive
distributions are almost exclusively combined using linear mixtures such as
Bayesian model averaging, Bayesian stacking, and mixture of experts. Such
linear mixtures impose idiosyncrasies, such as multi-modality, that might be
undesirable for some applications. While alternative pooling strategies exist
(e.g., the geometric bridge or superposition), optimising their parameters
usually involves repeatedly computing an intractable normalising constant. We
present two novel Bayesian …
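To make the contrast concrete, here is a minimal sketch (not from the paper) that pools two illustrative Gaussian predictive densities on a one-dimensional grid. The linear mixture is normalised by construction but can be multi-modal, whereas the geometric (log-linear) pool needs an explicit normalising constant, approximated here by quadrature; the weight, means, and scales are assumptions chosen purely for illustration.

```python
# Sketch: linear vs. geometric pooling of two predictive densities.
# All component choices (weight, means, scales, grid) are illustrative.
import numpy as np
from scipy.stats import norm

x = np.linspace(-8.0, 8.0, 2001)          # evaluation grid
p1 = norm.pdf(x, loc=-2.0, scale=1.0)     # predictive density of model 1
p2 = norm.pdf(x, loc=3.0, scale=1.5)      # predictive density of model 2
w = 0.4                                   # pooling weight on model 1

# Linear pool (the form used by BMA, stacking, mixture of experts):
# normalised by construction, but can inherit multiple modes.
linear_pool = w * p1 + (1.0 - w) * p2

# Geometric pool: p1^w * p2^(1-w) requires a normalising constant Z.
# On a 1-D grid Z is cheap via the trapezoidal rule; in realistic models
# it is generally intractable, which is the difficulty noted above.
unnormalised = p1 ** w * p2 ** (1.0 - w)
Z = np.trapz(unnormalised, x)
geometric_pool = unnormalised / Z

def local_maxima(y):
    # indices where y exceeds both neighbours (interior modes)
    return np.where((y[1:-1] > y[:-2]) & (y[1:-1] > y[2:]))[0] + 1

print("linear pool modes near x =", x[local_maxima(linear_pool)])
print("geometric pool mode near x =", x[local_maxima(geometric_pool)])
```

Running the sketch shows the linear pool keeping both component modes, while the geometric pool concentrates on a single mode between them, at the cost of the normalisation step.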

