Aug. 29, 2022, 1:12 a.m. | Fabio Sigrist

stat.ML updates on arXiv.org arxiv.org

We introduce a novel way to combine boosting with Gaussian process and mixed
effects models. This allows one to relax, first, the zero-mean or linearity
assumption for the prior mean function in Gaussian process and grouped random
effects models in a flexible non-parametric way and, second, the independence
assumption made in most boosting algorithms. The former is advantageous for
prediction accuracy and for avoiding model misspecification. The latter is
important for efficient learning of the fixed-effects predictor function and
for …
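As a rough illustration of the alternating scheme the abstract describes — boosting a non-parametric fixed-effects mean function while estimating grouped random effects — here is a minimal numpy sketch. It assumes a Gaussian likelihood, known variance components, and depth-1 trees (stumps) as base learners; the toy data and all names are illustrative, not the authors' actual algorithm or the GPBoost library implementation.

```python
import numpy as np

def fit_stump(x, r):
    """Fit a depth-1 regression tree (stump) minimizing squared error on r."""
    order = np.argsort(x)
    xs, rs = x[order], r[order]
    n = len(rs)
    csum, csum2 = np.cumsum(rs), np.cumsum(rs ** 2)
    # Fallback: a constant predictor (threshold below all data points).
    best = (xs[0] - 1.0, rs.mean(), rs.mean())
    best_sse = csum2[-1] - n * rs.mean() ** 2
    for i in range(1, n):
        if xs[i] == xs[i - 1]:
            continue  # cannot split between equal x values
        lm = csum[i - 1] / i                       # left-leaf mean
        rm = (csum[-1] - csum[i - 1]) / (n - i)    # right-leaf mean
        sse = (csum2[i - 1] - i * lm ** 2) + \
              (csum2[-1] - csum2[i - 1] - (n - i) * rm ** 2)
        if sse < best_sse:
            best_sse = sse
            best = ((xs[i - 1] + xs[i]) / 2.0, lm, rm)
    return best

def stump_predict(stump, x):
    thr, left_val, right_val = stump
    return np.where(x <= thr, left_val, right_val)

# Toy data: y = f(x) + b_{group} + noise, with a non-linear fixed-effects f.
rng = np.random.default_rng(0)
n, n_groups = 400, 20
x = rng.uniform(0.0, 10.0, n)
group = rng.integers(0, n_groups, n)
b_true = rng.normal(0.0, 1.0, n_groups)
y = np.sin(x) + b_true[group] + rng.normal(0.0, 0.3, n)

sigma2_b, sigma2_e = 1.0, 0.09    # variance components, assumed known here
F = np.full(n, y.mean())          # boosted fixed-effects predictor
b_hat = np.zeros(n_groups)        # estimated grouped random effects
nu = 0.1                          # learning rate (shrinkage)

for _ in range(100):
    # Random-effects step: shrunken group means of the residuals y - F.
    resid = y - F
    for g in range(n_groups):
        mask = group == g
        n_g = mask.sum()
        if n_g > 0:
            shrink = n_g * sigma2_b / (n_g * sigma2_b + sigma2_e)
            b_hat[g] = shrink * resid[mask].mean()
    # Boosting step: one stump fit to residuals net of the random effects,
    # so the tree learns the fixed-effects function rather than group offsets.
    r = y - b_hat[group] - F
    F += nu * stump_predict(fit_stump(x, r), x)

mse = np.mean((F + b_hat[group] - y) ** 2)
```

The key point of the alternation is that the boosting step sees residuals with the estimated random effects removed, so correlation within groups is not absorbed into the tree ensemble; the shrinkage factor `n_g * sigma2_b / (n_g * sigma2_b + sigma2_e)` is the usual best linear unbiased predictor for a Gaussian group effect.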

