April 26, 2024, 4:42 a.m. | Vlad-Raul Constantinescu, Ionel Popescu

cs.LG updates on arXiv.org

arXiv:2304.10552v2 Announce Type: replace
Abstract: In this paper, we prove that in the overparametrized regime, deep neural networks provide universal approximations and can interpolate any data set, as long as the activation function is locally in $L^1(\mathbb{R})$ and not an affine function.
Additionally, if the activation function is smooth and such an interpolation network exists, then the set of parameters which interpolate forms a manifold. Furthermore, we give a characterization of the Hessian of the loss function evaluated at the …
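The interpolation claim can be illustrated numerically. The sketch below is not the paper's construction; it is a minimal NumPy example, assuming a one-hidden-layer ReLU network with random hidden weights, where width `m` much larger than the number of data points `n` (the overparametrized regime) lets a least-squares fit of the outer weights interpolate arbitrary labels exactly, since the hidden feature matrix generically has full rank `n`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny data set: n points in 1-D with arbitrary labels.
n = 8
x = np.linspace(-1.0, 1.0, n)
y = rng.standard_normal(n)

# Overparametrized one-hidden-layer ReLU network: width m >> n.
m = 200
W = rng.standard_normal(m)  # hidden-layer weights
b = rng.standard_normal(m)  # hidden-layer biases

# Hidden features: Phi[i, j] = relu(W[j] * x[i] + b[j]).
Phi = np.maximum(W[None, :] * x[:, None] + b[None, :], 0.0)

# With m >= n, Phi generically has rank n, so least squares for the
# outer weights drives the training error to zero: exact interpolation.
a, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(np.max(np.abs(Phi @ a - y)))  # maximum residual, ~ machine precision
```

Note that ReLU is locally in $L^1(\mathbb{R})$ and not affine, so it satisfies the abstract's hypothesis; because many different `(W, b, a)` triples achieve zero loss here, the example is also consistent with the interpolating parameters forming a set of positive dimension rather than isolated points.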

