Beyond IID weights: sparse and low-rank deep Neural Networks are also Gaussian Processes
March 19, 2024, 4:45 a.m. | Thiziri Nait-Saada, Alireza Naderi, Jared Tanner
cs.LG updates on arXiv.org
Abstract: The infinitely wide neural network has proven to be a useful and tractable mathematical model for understanding many phenomena that appear in deep learning. One example is the convergence of random deep networks to Gaussian processes, which allows a rigorous analysis of how the choice of activation function and network weights impacts the training dynamics. In this paper, we extend the seminal proof of Matthews et al. (2018) to a larger class of …
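The classical IID-weight version of this convergence can be illustrated empirically. Below is a minimal NumPy sketch (not code from the paper): it samples many random one-hidden-layer ReLU networks with IID Gaussian weights, scaled by 1/sqrt(fan-in), and checks that the scalar output at a fixed input approaches the zero-mean Gaussian predicted by the infinite-width limit.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_net_output(x, width, rng):
    """Scalar output of a one-hidden-layer ReLU network with IID
    standard-Gaussian weights, scaled by 1/sqrt(fan-in) so the
    output variance stays O(1) as the width grows."""
    d = x.shape[0]
    W1 = rng.standard_normal((width, d)) / np.sqrt(d)   # hidden-layer weights
    w2 = rng.standard_normal(width) / np.sqrt(width)    # output weights
    return w2 @ np.maximum(W1 @ x, 0.0)

x = np.ones(3)
samples = np.array([random_net_output(x, 2048, rng) for _ in range(5000)])

# In the infinite-width limit the output is N(0, sigma^2) with
# sigma^2 = E[relu(z)^2] for z ~ N(0, ||x||^2 / d), i.e. 0.5 here.
print("mean:", samples.mean(), "var:", samples.var())
```

At width 2048 the empirical mean is already close to 0 and the variance close to the limiting value 0.5; the paper's contribution is showing that such Gaussian-process limits also hold beyond IID weights, for sparse and low-rank weight distributions.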