Impact of Parameter Sparsity on Stochastic Gradient MCMC Methods for Bayesian Deep Learning. (arXiv:2202.03770v1 [cs.LG])
cs.LG updates on arXiv.org
Bayesian methods hold significant promise for improving the uncertainty
quantification ability and robustness of deep neural network models. Recent
research has seen the investigation of a number of approximate Bayesian
inference methods for deep neural networks, building on both the variational
Bayesian and Markov chain Monte Carlo (MCMC) frameworks. A fundamental issue
with MCMC methods is that the improvements they enable are obtained at the
expense of increased computation time and model storage costs. In this paper,
we investigate the …
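The abstract refers to stochastic gradient MCMC (SG-MCMC) samplers whose improved uncertainty estimates come at extra compute and storage cost. The canonical member of this family is stochastic gradient Langevin dynamics (SGLD), which perturbs each gradient step with Gaussian noise so the iterates sample from the posterior. Below is a minimal, illustrative SGLD sketch on a toy 1-D Gaussian posterior; all names, the toy target, and the step size are assumptions for illustration, not details from the paper.

```python
import numpy as np

# Toy SGLD (stochastic gradient Langevin dynamics) sketch.
# Target posterior (illustrative): N(mu=2, sigma^2=1), so the
# log-posterior gradient is d/dtheta log p(theta) = -(theta - mu).

rng = np.random.default_rng(0)
mu = 2.0          # true posterior mean (toy example)
step = 0.05       # fixed step size (illustrative choice)

def grad_log_post(theta):
    # Gradient of log N(theta | mu, 1).
    return -(theta - mu)

theta = 0.0
samples = []
for t in range(50_000):
    # SGLD update: half-step along the gradient plus N(0, step) noise,
    # so the chain's stationary distribution approximates the posterior.
    theta = theta + 0.5 * step * grad_log_post(theta) \
            + rng.normal(0.0, np.sqrt(step))
    if t > 2_000:          # discard burn-in
        samples.append(theta)

posterior_mean = float(np.mean(samples))
posterior_var = float(np.var(samples))
```

The chain's sample mean and variance should approach the target's (2 and 1). The storage cost the abstract mentions arises because a practitioner must keep many such posterior samples of the full parameter vector, which is what motivates studying sparsity in this setting.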