Deep Horseshoe Gaussian Processes
March 5, 2024, 2:46 p.m. | Ismaël Castillo, Thibault Randrianarisoa
stat.ML updates on arXiv.org
Abstract: Deep Gaussian processes have recently been proposed as natural objects for fitting, similarly to deep neural networks, the possibly complex features present in modern data samples, such as compositional structures. Adopting a Bayesian nonparametric approach, it is natural to use deep Gaussian processes as prior distributions and to use the corresponding posterior distributions for statistical inference. We introduce the deep Horseshoe Gaussian process (Deep-HGP), a new simple prior based on deep Gaussian processes with a squared-exponential kernel, …
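As background for the abstract above, a deep Gaussian process composes GP layers, with each layer's sample path serving as the input to the next. The following is a minimal generic sketch of drawing from a two-layer deep GP prior with a squared-exponential kernel; it is not the paper's Deep-HGP construction, and the lengthscale and variance values are illustrative choices only.

```python
import numpy as np

def se_kernel(x, y, lengthscale=0.5, variance=1.0):
    """Squared-exponential (RBF) covariance between two 1-D input arrays."""
    d = x[:, None] - y[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp(x, lengthscale, rng):
    """Draw one zero-mean GP sample path at inputs x (jitter for stability)."""
    K = se_kernel(x, x, lengthscale) + 1e-8 * np.eye(len(x))
    return rng.multivariate_normal(np.zeros(len(x)), K)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)

# Layer 1 warps the input space; layer 2 is a GP evaluated at the warped inputs,
# giving the composition f(x) = g2(g1(x)).
h = sample_gp(x, lengthscale=0.3, rng=rng)
f = sample_gp(h, lengthscale=0.5, rng=rng)

print(f.shape)  # (200,)
```

The composition g2(g1(x)) is what lets deep GPs capture compositional structure that a single stationary GP cannot; priors such as the one proposed in the paper additionally regularize this construction.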