Aug. 23, 2022, 1:13 a.m. | Quynh Nguyen, Marco Mondelli, Guido Montufar

stat.ML updates on arXiv.org

A recent line of work has analyzed the theoretical properties of deep neural
networks via the Neural Tangent Kernel (NTK). In particular, the smallest
eigenvalue of the NTK has been related to the memorization capacity, the global
convergence of gradient descent algorithms and the generalization of deep nets.
However, existing results either provide bounds in the two-layer setting or
assume that the spectrum of the NTK matrices is bounded away from 0 for
multi-layer networks. In this paper, we provide …

arxiv eigenvalue kernel ml networks relu
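
For readers who want to see the object the abstract refers to, the sketch below computes the empirical (finite-width) NTK Gram matrix of a small deep ReLU network and its smallest eigenvalue using JAX. This is a minimal illustration of the quantity being bounded, not the paper's construction; the architecture, widths, He-style initialization, and sample size are arbitrary assumptions chosen only for the example.

```python
import jax
import jax.numpy as jnp

def init_params(key, widths):
    """He-style initialization for a fully connected ReLU net (weights only, no biases)."""
    params = []
    for d_in, d_out in zip(widths[:-1], widths[1:]):
        key, sub = jax.random.split(key)
        params.append(jax.random.normal(sub, (d_out, d_in)) * jnp.sqrt(2.0 / d_in))
    return params

def forward(params, x):
    """Scalar-output deep ReLU network f(theta, x)."""
    h = x
    for W in params[:-1]:
        h = jax.nn.relu(W @ h)
    return (params[-1] @ h)[0]

def empirical_ntk(params, X):
    """K_ij = <grad_theta f(x_i), grad_theta f(x_j)> at the given finite-width parameters."""
    def flat_grad(x):
        g = jax.grad(forward)(params, x)           # per-layer gradient pytree
        return jnp.concatenate([gi.ravel() for gi in g])
    G = jax.vmap(flat_grad)(X)                      # shape (n, num_params)
    return G @ G.T                                  # (n, n) NTK Gram matrix

# Hypothetical setup: 10-dim inputs, two hidden layers of width 256, 20 samples.
key = jax.random.PRNGKey(0)
widths = [10, 256, 256, 1]
params = init_params(key, widths)
X = jax.random.normal(jax.random.PRNGKey(1), (20, widths[0]))

K = empirical_ntk(params, X)
lambda_min = jnp.linalg.eigvalsh(K)[0]              # smallest eigenvalue of the NTK matrix
print(lambda_min)
```

In this setting, lambda_min being bounded away from zero is exactly the kind of spectral condition the abstract says earlier multi-layer results had to assume rather than prove.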
