Gradient-enhanced deep Gaussian processes for multifidelity modelling
Feb. 27, 2024, 5:43 a.m. | Viv Bone, Chris van der Heide, Kieran Mackle, Ingo H. J. Jahn, Peter M. Dower, Chris Manzie
cs.LG updates on arXiv.org
Abstract: Multifidelity models integrate data from multiple sources to produce a single approximator for the underlying process. Dense low-fidelity samples are used to reduce interpolation error, while sparse high-fidelity samples are used to compensate for bias or noise in the low-fidelity samples. Deep Gaussian processes (GPs) are attractive for multifidelity modelling as they are non-parametric, robust to overfitting, perform well for small datasets, and, critically, can capture nonlinear and input-dependent relationships between data of different fidelities. …
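The mechanism the abstract describes — dense, cheap low-fidelity samples to pin down the shape of the function, plus a sparse set of expensive high-fidelity samples to correct their bias — can be illustrated with the classic linear autoregressive (Kennedy–O'Hagan-style) two-fidelity scheme. This is only a minimal sketch of the multifidelity idea, not the paper's gradient-enhanced deep GP method (which exists precisely to handle the nonlinear, input-dependent cross-fidelity relationships this linear scheme cannot). The toy functions, length-scales, and the fixed scale factor `rho` below are all illustrative assumptions, using a Forrester-style benchmark pair.

```python
import numpy as np

# Hypothetical two-fidelity toy problem (Forrester-style benchmark pair).
def f_hi(x):
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)   # expensive high-fidelity

def f_lo(x):
    return 0.5 * f_hi(x) + 10 * (x - 0.5) - 5      # cheap, biased low-fidelity

def rbf(x1, x2, ls):
    # Squared-exponential kernel between two 1-D point sets.
    return np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / ls) ** 2)

def gp_predict(x_tr, y_tr, x_te, ls, jitter=1e-6):
    # Zero-mean GP regression: posterior mean at the test points.
    K = rbf(x_tr, x_tr, ls) + jitter * np.eye(len(x_tr))
    return rbf(x_te, x_tr, ls) @ np.linalg.solve(K, y_tr)

x_lo = np.linspace(0.0, 1.0, 40)   # dense low-fidelity samples
x_hi = np.linspace(0.0, 1.0, 5)    # sparse high-fidelity samples
x_te = np.linspace(0.0, 1.0, 200)  # evaluation grid

# Step 1: fit a GP to the dense low-fidelity data.
mu_lo_at_hi = gp_predict(x_lo, f_lo(x_lo), x_hi, ls=0.1)
mu_lo_at_te = gp_predict(x_lo, f_lo(x_lo), x_te, ls=0.1)

# Step 2: model the discrepancy f_hi - rho * f_lo with a second GP,
# trained only on the sparse high-fidelity points. rho is fixed here
# for simplicity; in practice it would be learned from the data.
rho = 2.0
delta = f_hi(x_hi) - rho * mu_lo_at_hi
mu_delta = gp_predict(x_hi, delta, x_te, ls=0.5)

# Multifidelity prediction: scaled low-fidelity trend plus correction.
mu_mf = rho * mu_lo_at_te + mu_delta
err_mf = np.max(np.abs(mu_mf - f_hi(x_te)))
err_lo = np.max(np.abs(mu_lo_at_te - f_hi(x_te)))
```

The sparse discrepancy GP only works here because `f_hi - rho * f_lo` happens to be smooth (linear, in this toy); when the cross-fidelity map is nonlinear or varies with the input, a deep GP composition of the kind the paper studies is needed instead.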