Universal characteristics of deep neural network loss surfaces from random matrix theory. (arXiv:2205.08601v1 [math-ph])
May 19, 2022, 1:11 a.m. | Nicholas P Baskerville, Jonathan P Keating, Francesco Mezzadri, Joseph Najnudel, Diego Granziol
cs.LG updates on arXiv.org arxiv.org
This paper considers several aspects of random matrix universality in deep
neural networks. Motivated by recent experimental work, we use universal
properties of random matrices related to local statistics to derive practical
implications for deep neural networks based on a realistic model of their
Hessians. In particular, we derive universal aspects of outliers in the spectra
of deep neural networks and demonstrate the important role of random matrix
local laws in popular preconditioned gradient descent algorithms. We also
present insights …
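The "outliers in the spectra" the abstract refers to can be illustrated with a classic random-matrix toy model (this sketch is not from the paper itself): adding a rank-one "spike" of strength theta > 1 to a Wigner matrix pushes a single eigenvalue out of the semicircle bulk to roughly theta + 1/theta, the BBP transition. The matrix size and spike strength below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Wigner (GOE-like) matrix: symmetric, entry variance 1/n, so the
# bulk eigenvalue density follows the semicircle law on [-2, 2].
a = rng.standard_normal((n, n))
w = (a + a.T) / np.sqrt(2 * n)

# Rank-one spike of strength theta > 1: one eigenvalue separates
# from the bulk at approximately theta + 1/theta (BBP transition).
theta = 3.0
v = np.ones(n) / np.sqrt(n)
h = w + theta * np.outer(v, v)

eigs = np.linalg.eigvalsh(h)
bulk_edge = eigs[-2]  # largest bulk eigenvalue, close to 2
outlier = eigs[-1]    # spiked outlier, close to theta + 1/theta

print(f"bulk edge ~ {bulk_edge:.2f}, outlier ~ {outlier:.2f}")
```

In the paper's setting the role of the spike is played by low-rank structure in the network Hessian, while the bulk stands in for the universal local statistics; this toy model only shows why an isolated outlier is expected to detach from the bulk.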