$\pi$VAE: a stochastic process prior for Bayesian deep learning with MCMC. (arXiv:2002.06873v6 [cs.LG] UPDATED)
stat.ML updates on arXiv.org
Stochastic processes provide a mathematically elegant way to model complex data.
In theory, they provide flexible priors over function classes that can encode a
wide range of interesting assumptions. In practice, however, efficient
inference by optimisation or marginalisation is difficult, a problem further
exacerbated with big data and high dimensional input spaces. We propose a novel
variational autoencoder (VAE) called the prior encoding variational autoencoder
($\pi$VAE). The $\pi$VAE is finitely exchangeable and Kolmogorov consistent,
and thus is a continuous stochastic process. …
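The core idea the abstract gestures at can be illustrated with a small sketch: once a decoder has been trained to map a low-dimensional latent vector to function values (in $\pi$VAE, by fitting a VAE to draws from a stochastic-process prior), posterior inference over functions reduces to cheap MCMC in the latent space. Everything below is a hypothetical stand-in, not the paper's architecture: the sinusoidal-basis `decode`, the noise level, and the random-walk Metropolis settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "trained" decoder: maps a 4-dimensional latent z to function
# values on a fixed grid. In piVAE this map is a learned neural network; a
# fixed sinusoidal basis stands in for it here purely for illustration.
grid = np.linspace(0.0, 1.0, 50)
basis = np.stack([np.sin((k + 1) * np.pi * grid) for k in range(4)], axis=1)

def decode(z):
    return basis @ z  # function values at the 50 grid points

# Synthetic observations: one latent function plus Gaussian noise.
z_true = rng.normal(size=4)
sigma = 0.1
y = decode(z_true) + sigma * rng.normal(size=grid.size)

def log_post(z):
    # log p(z | y) up to a constant: Gaussian likelihood + N(0, I) prior on z.
    resid = y - decode(z)
    return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * np.sum(z**2)

# Random-walk Metropolis over the latent space. Because z is low-dimensional,
# the cost of MCMC is independent of how finely the function is gridded.
z = np.linalg.lstsq(basis, y, rcond=None)[0]  # start near the mode
samples = []
for _ in range(5000):
    prop = z + 0.05 * rng.normal(size=4)
    if np.log(rng.uniform()) < log_post(prop) - log_post(z):
        z = prop
    samples.append(z)
samples = np.array(samples)

# Posterior mean function, discarding the first half as burn-in.
post_mean_fn = decode(samples[2500:].mean(axis=0))
```

The point of the sketch is the dimensionality shift: marginalising a stochastic-process prior directly scales with the number of evaluation points, whereas sampling here only ever touches the 4-dimensional latent.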