May 20, 2022, 1:12 a.m. | Blake Bordelon, Cengiz Pehlevan

cs.LG updates on arXiv.org

We analyze feature learning in infinite-width neural networks trained with
gradient flow, using a self-consistent dynamical field theory. We construct a
collection of deterministic dynamical order parameters: inner-product kernels
for hidden-unit activations and gradients in each layer, evaluated at pairs of
time points, which provide a reduced description of network activity through
training. These kernel order parameters collectively define the hidden-layer
activation distribution, the evolution of the neural tangent kernel, and
consequently the network's output predictions. For deep linear …
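
The construction lends itself to a simple finite-width check. Below is a minimal sketch (plain NumPy; the network architecture, hyperparameters, and variable names are illustrative assumptions, not the paper's code or notation) that trains a one-hidden-layer ReLU network by discretized gradient flow in a mean-field-style feature-learning scaling and records the empirical inner-product kernels for hidden activations and backpropagated gradient signals at pairs of time points. In the infinite-width limit, these empirical kernels concentrate around the deterministic order parameters the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: P inputs of dimension D, scalar targets (all values illustrative).
P, D, N = 8, 5, 4096          # N = hidden width, large to approximate the infinite-width limit
X = rng.standard_normal((P, D)) / np.sqrt(D)
y = rng.standard_normal(P)

# One hidden layer with mean-field output scaling f = w1 . phi(h) / N,
# a regime in which feature learning survives as N grows.
W0 = rng.standard_normal((N, D))
w1 = rng.standard_normal(N)

lr, T_steps = 0.2, 80          # discretized gradient flow
snapshots = [0, 25, 50, 79]    # time points t at which we record the fields
acts, grads = [], []

for t in range(T_steps):
    H = X @ W0.T               # pre-activations h, shape (P, N)
    Phi = np.maximum(H, 0.0)   # hidden activations phi(h)
    f = Phi @ w1 / N           # mean-field readout
    err = f - y                # residual of the squared loss 0.5 * |f - y|^2
    G_sig = (H > 0.0) * w1[None, :]  # gradient field phi'(h) * w1 at the hidden layer
    if t in snapshots:
        acts.append(Phi.copy())
        grads.append(G_sig.copy())
    # Gradient-descent updates with an N-scaled learning rate that keeps
    # per-unit feature movement O(1) as N grows (mean-field-style scaling).
    w1 -= lr * Phi.T @ err
    W0 -= lr * (err[:, None] * G_sig).T @ X

# Empirical kernel order parameters at pairs of snapshot times (t, s):
# Phi_k[a, b] = (1/N) phi(h(t_a)) . phi(h(t_b)), and G_k likewise for gradients.
nT = len(snapshots)
Phi_k = np.array([[acts[a] @ acts[b].T / N for b in range(nT)] for a in range(nT)])
G_k = np.array([[grads[a] @ grads[b].T / N for b in range(nT)] for a in range(nT)])
print(Phi_k.shape, G_k.shape)  # (nT, nT, P, P): kernels over time pairs and input pairs
```

The equal-time slices of these activation and gradient kernels combine to give the empirical neural tangent kernel at each training time, which is how the order parameters determine the evolution of the network's predictions.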

