June 27, 2022, 1:10 a.m. | Stefano Favaro, Sandra Fortini, Stefano Peluchetti

cs.LG updates on arXiv.org

In modern deep learning, a recent and growing literature studies the
interplay between the large-width asymptotic properties of deep Gaussian neural
networks (NNs), i.e., deep NNs with Gaussian-distributed weights, and Gaussian
stochastic processes (SPs). This interplay has proved critical for
Bayesian inference under Gaussian SP priors, kernel regression for infinitely
wide deep NNs trained via gradient descent, and information propagation within
infinitely wide NNs. Motivated by empirical analyses that show the potential of
replacing Gaussian distributions …
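The large-width behaviour the abstract refers to (finite-dimensional outputs of a Gaussian-weight NN approaching a Gaussian SP as the width grows) can be checked numerically. Below is a minimal sketch, not taken from the paper, assuming a one-hidden-layer tanh network with unit-variance Gaussian weights and 1/sqrt(width) output scaling; the function names, variances, and the kurtosis diagnostic are illustrative choices.

```python
# Sketch: outputs of a random one-hidden-layer NN with i.i.d. Gaussian weights
# become approximately Gaussian as the hidden width grows (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def random_nn_outputs(x, width, n_samples=10_000):
    """Sample f(x) = (1/sqrt(width)) * sum_j v_j * tanh(w_j * x + b_j)
    over independent standard-Gaussian draws of the weights (w, b, v)."""
    w = rng.normal(0.0, 1.0, size=(n_samples, width))
    b = rng.normal(0.0, 1.0, size=(n_samples, width))
    v = rng.normal(0.0, 1.0, size=(n_samples, width))
    return (v * np.tanh(w * x + b)).sum(axis=1) / np.sqrt(width)

for width in (1, 10, 500):
    f = random_nn_outputs(x=0.5, width=width)
    # Excess kurtosis of a Gaussian is 0; it shrinks toward 0 as width grows.
    kurt = ((f - f.mean()) ** 4).mean() / f.var() ** 2 - 3.0
    print(f"width={width:4d}  mean={f.mean():+.3f}  var={f.var():.3f}  "
          f"excess kurtosis={kurt:+.3f}")
```

Running the sketch shows the excess kurtosis of the output distribution approaching zero with increasing width, the hallmark of the Gaussian SP limit discussed above.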

Tags: arxiv, convergence, cs.LG, networks, neural networks
