On the non-universality of deep learning: quantifying the cost of symmetry. (arXiv:2208.03113v2 [cs.LG] UPDATED)
Oct. 17, 2022, 1:13 a.m. | Emmanuel Abbe, Enric Boix-Adsera
cs.LG updates on arXiv.org
We prove limitations on what neural networks trained by noisy gradient descent (GD) can efficiently learn. Our results apply whenever GD training is equivariant, which holds for many standard architectures and initializations. As applications, (i) we characterize the functions that fully-connected networks can weak-learn on the binary hypercube and unit sphere, demonstrating that depth-2 is as powerful as any other depth for this task; (ii) we extend the merged-staircase necessity result for learning with latent low-dimensional structure [ABM22] to beyond …
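The equivariance property the abstract relies on can be illustrated with a small numerical sketch (an illustrative example, not code from the paper): full-batch gradient descent on a depth-2 fully-connected network commutes with permutations of the input coordinates, so permuting the data and correspondingly permuting the first-layer weights reproduces the same training trajectory.

```python
import numpy as np

# Illustrative sketch (assumption: toy setup, not from the paper) of GD
# equivariance: permuting input coordinates of the data, together with the
# matching rows of the first-layer weights, leaves training unchanged.

rng = np.random.default_rng(0)
n, d, h = 32, 6, 8
X = rng.standard_normal((n, d))   # inputs
y = rng.standard_normal(n)        # targets

def train(X, W1, w2, steps=50, lr=0.1):
    """Full-batch GD on a depth-2 tanh network with squared loss."""
    W1, w2 = W1.copy(), w2.copy()
    for _ in range(steps):
        Z = np.tanh(X @ W1)                      # hidden activations (n, h)
        err = Z @ w2 - y                         # residuals
        grad_w2 = Z.T @ err / n                  # gradient w.r.t. output layer
        grad_Z = np.outer(err, w2) * (1 - Z**2)  # backprop through tanh
        grad_W1 = X.T @ grad_Z / n               # gradient w.r.t. first layer
        W1 -= lr * grad_W1
        w2 -= lr * grad_w2
    return W1, w2

W1_init = rng.standard_normal((d, h))
w2_init = rng.standard_normal(h)
perm = rng.permutation(d)

W1_a, w2_a = train(X, W1_init, w2_init)
W1_b, w2_b = train(X[:, perm], W1_init[perm], w2_init)

# The run on permuted data learns exactly the permuted first-layer weights.
print(np.allclose(W1_a[perm], W1_b), np.allclose(w2_a, w2_b))  # True True
```

Because `X[:, perm] @ W1_init[perm]` equals `X @ W1_init`, the hidden activations and output-layer updates coincide at every step, which is the mechanism behind the equivariance assumption.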