Feb. 28, 2024, 5:42 a.m. | Daniel Iong, Matthew McAnear, Yuezhou Qu, Shasha Zou, Gabor Toth, Yang Chen

cs.LG updates on arXiv.org arxiv.org

arXiv:2402.17570v1 Announce Type: new
Abstract: Gaussian Processes (GP) have become popular machine learning methods for kernel-based learning on datasets with complicated covariance structures. In this paper, we present a novel extension to the GP framework using a contaminated normal likelihood function to better account for heteroscedastic variance and outlier noise. We propose a scalable inference algorithm based on the Sparse Variational Gaussian Process (SVGP) method for fitting sparse Gaussian process regression models with contaminated normal noise on large datasets. …
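For intuition, the contaminated normal likelihood named in the abstract is a two-component Gaussian mixture that shares a mean: most observations come from a "clean" component, while a small fraction comes from an inflated-variance component that absorbs outliers. The sketch below is a minimal illustration of that density in Python, not the authors' implementation; the function name and the hyperparameters `eps` (contamination fraction) and `scale` (variance inflation) are assumptions for demonstration only.

import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

def contaminated_normal_logpdf(y, f, sigma, eps=0.05, scale=10.0):
    """Log-density of y under (1 - eps) * N(f, sigma^2) + eps * N(f, (scale*sigma)^2).

    `eps` and `scale` are illustrative values, not ones taken from the paper.
    """
    log_clean = np.log1p(-eps) + norm.logpdf(y, loc=f, scale=sigma)        # main component
    log_outlier = np.log(eps) + norm.logpdf(y, loc=f, scale=scale * sigma) # outlier component
    return logsumexp([log_clean, log_outlier], axis=0)                     # mixture log-density

# An outlying observation (y far from the latent value f) receives a much less
# extreme log-likelihood than under a plain Gaussian with the same sigma,
# which is what makes the GP fit robust to such points.
y, f, sigma = 5.0, 0.0, 1.0
print(contaminated_normal_logpdf(y, f, sigma))  # contaminated normal
print(norm.logpdf(y, loc=f, scale=sigma))       # plain Gaussian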
