Stein Variational Gaussian Processes. (arXiv:2009.12141v3 [stat.ML] UPDATED)
Jan. 20, 2022, 2:10 a.m. | Thomas Pinder, Christopher Nemeth, David Leslie
cs.LG updates on arXiv.org
We show how to use Stein variational gradient descent (SVGD) to carry out
inference in Gaussian process (GP) models with non-Gaussian likelihoods and
large data volumes. Markov chain Monte Carlo (MCMC) is extremely
computationally intensive in these settings, while the parametric assumptions
required for efficient variational inference (VI) yield incorrect inference
when the posterior is multi-modal, as is common for such models. SVGD provides
a non-parametric alternative to variational inference that is substantially
faster than MCMC. We …
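For readers unfamiliar with SVGD, the core idea can be illustrated with a minimal sketch (this is generic SVGD on a toy density, not the paper's GP-specific construction): a set of particles is deterministically transported toward a target density p(x) via the update phi(x_i) = (1/n) * sum_j [k(x_j, x_i) * grad log p(x_j) + grad_{x_j} k(x_j, x_i)], where the first term pulls particles toward high-density regions and the second (kernel-gradient) term repels particles from each other, preventing mode collapse. The RBF kernel, step size, and standard-normal target below are illustrative choices.

```python
# Minimal SVGD sketch: transport particles toward a target density.
# Target here is a standard normal N(0, 1), chosen purely for illustration;
# the paper applies SVGD to Gaussian process posteriors instead.
import numpy as np

def rbf_kernel(x, h=1.0):
    """Return K[j, i] = k(x_j, x_i) and its gradient w.r.t. x_j for 1-D particles."""
    diff = x[:, None] - x[None, :]        # diff[j, i] = x_j - x_i
    K = np.exp(-diff**2 / (2 * h**2))     # RBF kernel matrix
    gradK = -diff / h**2 * K              # d k(x_j, x_i) / d x_j
    return K, gradK

def svgd_step(x, grad_logp, step=0.1, h=1.0):
    """One SVGD update: attraction via grad log p, repulsion via the kernel gradient."""
    n = x.shape[0]
    K, gradK = rbf_kernel(x, h)
    # phi[i] = (1/n) * sum_j ( K[j, i] * grad_logp(x_j) + gradK[j, i] )
    phi = (K.T @ grad_logp(x) + gradK.sum(axis=0)) / n
    return x + step * phi

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=0.5, size=100)  # particles start far from the target
grad_logp = lambda x: -x                      # grad log N(0, 1)

for _ in range(500):
    x = svgd_step(x, grad_logp)

print(x.mean(), x.std())  # particles should roughly match the target's moments
```

Because the update is deterministic given the particles, SVGD avoids MCMC's long mixing times, while the particle ensemble (unlike a fixed parametric VI family) can represent multi-modal posteriors.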