Self-Consistent Dynamical Field Theory of Kernel Evolution in Wide Neural Networks. (arXiv:2205.09653v1 [stat.ML])
cs.LG updates on arXiv.org
We analyze feature learning in infinite width neural networks trained with
gradient flow through a self-consistent dynamical field theory. We construct a
collection of deterministic dynamical order parameters which are inner-product
kernels for hidden unit activations and gradients in each layer at pairs of
time points, providing a reduced description of network activity through
training. These kernel order parameters collectively define the hidden layer
activation distribution, the evolution of the neural tangent kernel, and
consequently output predictions. For deep linear …
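To make the order-parameter construction concrete, here is a minimal sketch (plain NumPy, not the paper's DMFT solver): it trains a wide one-hidden-layer network by gradient descent and records the feature-kernel order parameter Phi(t, s) = (1/N) phi(h(t)) . phi(h(s)) between pairs of checkpoint times, for all pairs of training inputs. The width, learning rate, readout scaling, and snapshot times are illustrative assumptions, not values from the paper; gradient kernels would be formed analogously from the backpropagated signals.

```python
# Minimal sketch: record feature-kernel order parameters Phi(t, s) during
# gradient-descent training of a wide one-hidden-layer network.
# All hyperparameters here (N, P, eta, snapshot times) are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, D, P = 4096, 8, 16          # width, input dim, number of training points
X = rng.standard_normal((P, D)) / np.sqrt(D)
y = rng.standard_normal(P)

W1 = rng.standard_normal((D, N))
w2 = rng.standard_normal(N) / N   # 1/N readout scaling (an illustrative choice)
eta, steps = 0.5, 200
snapshots, H_log = {0, 50, 200}, {}

phi = np.tanh
for t in range(steps + 1):
    H = phi(X @ W1)                    # hidden activations, shape (P, N)
    f = H @ w2                         # network outputs
    if t in snapshots:
        H_log[t] = H.copy()            # checkpoint the features at time t
    err = f - y                        # squared-loss residual
    # Gradient-descent updates for both layers (tanh' = 1 - tanh^2).
    w2 -= eta * H.T @ err / P
    W1 -= eta * X.T @ (np.outer(err, w2) * (1 - H**2)) / P

# Feature-kernel order parameters at pairs of time points: each Phi[(t, s)]
# is a P x P inner-product kernel between time-t and time-s hidden features.
Phi = {(t, s): H_log[t] @ H_log[s].T / N
       for t in H_log for s in H_log}
print(Phi[(0, 200)].shape)  # (16, 16)
```

In the paper's infinite-width limit these kernels become deterministic and are solved for self-consistently; the sketch above only illustrates the finite-width empirical quantities they describe.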