April 26, 2024, 4:42 a.m. | Vlad-Raul Constantinescu, Ionel Popescu

cs.LG updates on arXiv.org

arXiv:2304.10552v2 Announce Type: replace
Abstract: In this paper, we prove that in the overparametrized regime, deep neural networks provide universal approximation and can interpolate any data set, as long as the activation function is locally in $L^1(\mathbb{R})$ and not an affine function.
Additionally, if the activation function is smooth and such an interpolation network exists, then the set of parameters which interpolate the data forms a manifold. Furthermore, we give a characterization of the Hessian of the loss function evaluated at the …
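The interpolation claim can be illustrated with a toy numerical sketch. This is not the paper's construction or proof; it is a standard random-feature argument with assumed sizes (n = 10 points, width m = 200) and a tanh activation, which is smooth and non-affine.

import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all sizes are arbitrary choices for the demo):
# n data points in R^1, a single hidden layer of width m >> n.
n, m = 10, 200
X = rng.normal(size=(n, 1))
y = rng.normal(size=(n, 1))

# Random (frozen) hidden layer with a non-affine activation (tanh).
W1 = rng.normal(size=(1, m))
b1 = rng.normal(size=(1, m))
H = np.tanh(X @ W1 + b1)   # hidden-feature matrix, shape (n, m)

# Because m >> n and tanh is not affine, H almost surely has full row
# rank, so an output layer that interpolates the data exactly exists.
# The least-squares solve below finds the minimum-norm such layer.
W2, *_ = np.linalg.lstsq(H, y, rcond=None)

pred = H @ W2
print("max interpolation error:", np.abs(pred - y).max())   # ~1e-12

The paper's result is considerably stronger (trained deep networks, activations that are merely locally $L^1$ and non-affine, plus the manifold and Hessian statements), but the toy solve shows the basic overparametrization phenomenon: once the width exceeds the number of samples, exact interpolation becomes generic.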
