May 15, 2023, 12:42 a.m. | Yuling Yao, Luiz Max Carvalho, Diego Mesquita, Yann McLatchie

stat.ML updates on arXiv.org

Combining predictions from different models is a central problem in Bayesian
inference and machine learning more broadly. Currently, these predictive
distributions are almost exclusively combined using linear mixtures such as
Bayesian model averaging, Bayesian stacking, and mixture of experts. Such
linear mixtures impose certain properties, such as multi-modality, that may
be undesirable in some applications. While alternative strategies exist
(e.g. geometric bridge or superposition), optimising their parameters
typically requires repeatedly computing an intractable normalising
constant. We present
two novel Bayesian …
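For concreteness, here is a minimal sketch of the two families of pooling the abstract contrasts, assuming two hypothetical Gaussian component predictives and fixed weights (all names and values below are illustrative, not from the paper). The linear pool is a convex combination that needs no extra normalisation, while the geometric pool is defined only up to a normalising constant:

import numpy as np
from scipy.stats import norm

# Grid over which to evaluate the component predictive densities.
y = np.linspace(-8.0, 8.0, 2001)
dy = y[1] - y[0]

# Two hypothetical component predictives (stand-ins for fitted models).
p1 = norm.pdf(y, loc=-2.0, scale=1.0)
p2 = norm.pdf(y, loc=3.0, scale=1.5)
w1, w2 = 0.6, 0.4  # pooling weights (assumed given, summing to 1)

# Linear pool (mixture): a convex combination. It is automatically
# normalised, but can be multi-modal even when every component is unimodal.
linear_pool = w1 * p1 + w2 * p2

# Geometric pool: a weighted geometric mean of densities. It is defined
# only up to the normalising constant Z(w). In this 1-D toy example Z is
# a cheap Riemann sum, but in general the integral is intractable and
# must be re-evaluated every time the weights change.
unnormalised = p1**w1 * p2**w2
Z = (unnormalised * dy).sum()
geometric_pool = unnormalised / Z

In higher dimensions the integral defining Z has no closed form, which is why optimising the weights of a geometric (or superposed) pool repeatedly runs into an intractable normalising constant, as the abstract notes.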
