Sharp asymptotics on the compression of two-layer neural networks. (arXiv:2205.08199v4 [cs.IT] UPDATED)
Aug. 17, 2022, 1:11 a.m. | Mohammad Hossein Amani, Simone Bombari, Marco Mondelli, Rattana Pukdee, Stefano Rini
stat.ML updates on arXiv.org
In this paper, we study the compression of a target two-layer neural network
with N nodes into a compressed network with M<N nodes. More precisely, we
consider the setting in which the weights of the target network are i.i.d.
sub-Gaussian, and we minimize the population L_2 loss between the outputs of
the target and of the compressed network, under the assumption of Gaussian
inputs. By using tools from high-dimensional probability, we show that this
non-convex problem can be simplified when …
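
To make the setting concrete, here is a minimal sketch of the objective described above: a target two-layer ReLU network with N nodes whose weights are drawn i.i.d. Gaussian (a sub-Gaussian distribution), a compressed network with M < N nodes, and a Monte Carlo estimate of the population L_2 loss between their outputs under Gaussian inputs. All dimensions, the ReLU activation, and the 1/sqrt scaling are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: input dim d, target width N, compressed width M < N.
d, N, M = 20, 50, 10

# Target network weights drawn i.i.d. Gaussian (a sub-Gaussian distribution).
W_target = rng.normal(size=(N, d)) / np.sqrt(d)
a_target = rng.normal(size=N) / np.sqrt(N)

# Compressed network; here just a random initialization, in place of the
# minimizer the paper analyzes.
W_comp = rng.normal(size=(M, d)) / np.sqrt(d)
a_comp = rng.normal(size=M) / np.sqrt(M)

def population_l2_loss(num_samples=100_000):
    """Monte Carlo estimate of E_x[(f_target(x) - f_comp(x))^2] for x ~ N(0, I_d),
    where f(x) = a^T ReLU(W x)."""
    X = rng.normal(size=(num_samples, d))
    out_target = np.maximum(X @ W_target.T, 0.0) @ a_target
    out_comp = np.maximum(X @ W_comp.T, 0.0) @ a_comp
    return np.mean((out_target - out_comp) ** 2)

loss = population_l2_loss()
print(f"estimated population L2 loss: {loss:.4f}")
```

Minimizing this quantity over (W_comp, a_comp) is the non-convex problem the abstract refers to; the paper's contribution is a high-dimensional-probability analysis of when it simplifies.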