Oct. 31, 2022, 1:13 a.m. | Zhengdao Chen, Eric Vanden-Eijnden, Joan Bruna

stat.ML updates on arXiv.org

To understand the training dynamics of neural networks (NNs), prior studies
have considered the infinite-width mean-field (MF) limit of two-layer NNs,
establishing theoretical guarantees of its convergence under gradient flow
training as well as its approximation and generalization capabilities. In this
work, we study the infinite-width limit of a type of three-layer NN model whose
first layer is random and fixed. To define the limiting model rigorously, we
generalize the MF theory of two-layer NNs by treating the neurons as …
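
For intuition, the partially-trained setting described in the abstract can be sketched in a few lines of NumPy: a three-layer network whose first-layer weights are sampled once and then frozen, with only the later layers updated by gradient descent (a finite-step stand-in for gradient flow). The architecture, scaling, loss, and hyperparameters below are illustrative assumptions, not the paper's exact construction.

```python
# Minimal sketch (assumed setup): a three-layer network whose first-layer
# weights W1 are drawn at random and kept fixed; only W2 and the output
# weights a are trained, here by plain gradient descent on squared loss.
import numpy as np

rng = np.random.default_rng(0)
d, h1, h2, n = 5, 64, 64, 200            # input dim, hidden widths, samples

X = rng.normal(size=(n, d))
y = np.sin(X @ rng.normal(size=d))        # toy regression target

W1 = rng.normal(size=(d, h1)) / np.sqrt(d)    # random, fixed first layer
W2 = rng.normal(size=(h1, h2)) / np.sqrt(h1)  # trainable second layer
a = rng.normal(size=h2)                       # trainable output weights

relu = lambda z: np.maximum(z, 0.0)

lr = 0.1
for step in range(2000):
    H1 = relu(X @ W1)                    # fixed random features
    H2 = relu(H1 @ W2)
    pred = H2 @ a / h2                   # mean-field-style 1/width scaling
    err = pred - y                       # squared-loss residual

    # Gradients for the trainable parameters only (W1 never changes).
    grad_a = H2.T @ err / (n * h2)
    grad_H2 = np.outer(err, a / h2) * (H2 > 0)
    grad_W2 = H1.T @ grad_H2 / n

    a -= lr * grad_a
    W2 -= lr * grad_W2

final = relu(relu(X @ W1) @ W2) @ a / h2
print("final mse:", float(np.mean((final - y) ** 2)))
```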

