Sharp asymptotics on the compression of two-layer neural networks. (arXiv:2205.08199v2 [cs.IT] UPDATED)
May 19, 2022, 1:12 a.m. | Mohammad Hossein Amani, Simone Bombari, Marco Mondelli, Rattana Pukdee, Stefano Rini
cs.LG updates on arXiv.org arxiv.org
In this paper, we study the compression of a target two-layer neural network
with N nodes into a compressed network with M < N nodes. More precisely, we
consider the setting in which the weights of the target network are i.i.d.
sub-Gaussian, and we minimize the population L2 loss between the outputs of the
target and of the compressed network, under the assumption of Gaussian inputs.
By using tools from high-dimensional probability, we show that this non-convex
problem can be …
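The setup described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's method: it builds a target two-layer network with N i.i.d. Gaussian weights, a compressed network with M < N nodes, and Monte Carlo estimates the population L2 loss between their outputs under Gaussian inputs. The ReLU activation, the 1/sqrt scaling, and the random (rather than optimized) compressed weights are all assumptions for illustration; the paper studies minimizing this loss over the compressed parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

d, N, M = 50, 100, 20  # input dimension, target width N, compressed width M < N

def two_layer(X, W, a):
    """Two-layer net f(x) = sum_i a_i * relu(w_i . x), applied row-wise to X.
    ReLU is an assumed activation choice for this sketch."""
    return np.maximum(X @ W.T, 0.0) @ a

# Target network: i.i.d. Gaussian weights (a special case of sub-Gaussian)
W_target = rng.standard_normal((N, d)) / np.sqrt(d)
a_target = rng.standard_normal(N) / np.sqrt(N)

# Compressed network with M < N nodes; here initialized at random,
# whereas the paper minimizes the population L2 loss over these weights
W_comp = rng.standard_normal((M, d)) / np.sqrt(d)
a_comp = rng.standard_normal(M) / np.sqrt(M)

# Monte Carlo estimate of the population L2 loss under Gaussian inputs
X = rng.standard_normal((10_000, d))
loss = np.mean((two_layer(X, W_target, a_target) - two_layer(X, W_comp, a_comp)) ** 2)
print(f"estimated population L2 loss: {loss:.4f}")
```

Replacing the random compressed weights with the output of an optimizer over `(W_comp, a_comp)` turns this estimate into the non-convex objective the paper analyzes.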