Aug. 29, 2022, 1:12 a.m. | Fabio Sigrist

stat.ML updates on arXiv.org arxiv.org

We introduce a novel way of combining boosting with Gaussian process and mixed
effects models. This allows one, first, to relax the zero or linearity
assumption on the prior mean function in Gaussian process and grouped random
effects models in a flexible non-parametric way and, second, to relax the
independence assumption made in most boosting algorithms. The former is
advantageous for prediction accuracy and for avoiding model misspecification.
The latter is important for efficient learning of the fixed effects predictor
function and for …
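To make the idea concrete, here is a minimal, self-contained numpy sketch of the alternating scheme the abstract hints at: boosting learns a non-parametric fixed-effects function while grouped random effects are re-estimated between boosting steps. This is an illustrative toy, not the paper's actual GPBoost algorithm; the base learner (a regression stump), the simulated data, and the assumption of known variance components (`sigma2_b`, `sigma2_e`) are all simplifications made here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate grouped data: y = F(x) + b_group + noise, with nonlinear F
n, n_groups = 500, 20
x = rng.uniform(-2, 2, n)
groups = rng.integers(0, n_groups, n)
b = rng.normal(0, 1.0, n_groups)              # true random group effects
y = np.sin(2 * x) + b[groups] + rng.normal(0, 0.3, n)

def fit_stump(x, r):
    """Fit a one-split regression stump to residuals r (least squares)."""
    best_sse, best = np.inf, None
    for s in np.quantile(x, np.linspace(0.05, 0.95, 19)):
        left = x <= s
        lmean, rmean = r[left].mean(), r[~left].mean()
        sse = ((r - np.where(left, lmean, rmean)) ** 2).sum()
        if sse < best_sse:
            best_sse, best = sse, (s, lmean, rmean)
    return best

def predict_stump(stump, x):
    s, lmean, rmean = stump
    return np.where(x <= s, lmean, rmean)

F = np.zeros(n)                   # fixed-effects predictor, learned by boosting
lr, n_iter = 0.1, 200
sigma2_b, sigma2_e = 1.0, 0.3 ** 2   # variance components assumed known here

for _ in range(n_iter):
    # (1) re-estimate group effects from current residuals
    #     (shrunken group means, a BLUP-style update)
    resid = y - F
    b_hat = np.zeros(n_groups)
    for g in range(n_groups):
        m = groups == g
        k = m.sum()
        if k == 0:
            continue
        b_hat[g] = sigma2_b * k / (sigma2_b * k + sigma2_e) * resid[m].mean()
    # (2) boosting step on residuals with the group effects removed,
    #     so the base learner only has to explain the fixed-effects part
    stump = fit_stump(x, resid - b_hat[groups])
    F += lr * predict_stump(stump, x)

# F should now approximate the true fixed-effects function sin(2x)
err = np.mean((F - np.sin(2 * x)) ** 2)
```

The key point the sketch illustrates is the second claim in the abstract: because step (1) removes the correlated group structure before each boosting update, step (2) can treat the de-grouped residuals as approximately independent, which a standard boosting algorithm applied directly to `y` could not.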

