Web: http://arxiv.org/abs/2201.12082

Jan. 31, 2022, 2:11 a.m. | Takashi Mori, Masahito Ueda

cs.LG updates on arXiv.org

It has been recognized that heavily overparameterized deep neural networks
(DNNs) exhibit surprisingly good generalization performance in various
machine-learning tasks. Although the benefits of depth have been investigated
from different perspectives, such as approximation theory and statistical
learning theory, existing theories do not adequately explain the empirical
success of overparameterized DNNs. In this work, we report a remarkable
interplay between the depth of a network and the locality of the target
function. We introduce $k$-local and $k$-global functions, and find that depth is beneficial …
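The abstract is truncated before these function classes are defined. As a minimal illustrative sketch only (the names `k_local` and `k_global` and their exact forms below are assumptions, not the paper's definitions), a $k$-local target could depend only on $k$ adjacent input coordinates, while a $k$-global target could sum the same degree-$k$ interaction over every window of the input, so that all coordinates matter:

```python
import numpy as np

def k_local(x: np.ndarray, k: int) -> float:
    """Hypothetical k-local target: depends only on k adjacent
    coordinates of the input (an assumption for illustration)."""
    return float(np.prod(x[:k]))

def k_global(x: np.ndarray, k: int) -> float:
    """Hypothetical k-global target: aggregates a degree-k interaction
    over all length-k windows, so every coordinate is relevant."""
    windows = np.lib.stride_tricks.sliding_window_view(x, k)
    return float(np.sum(np.prod(windows, axis=1)))

# Usage: evaluate both targets on a random 16-dimensional input.
x = np.random.default_rng(0).standard_normal(16)
print(k_local(x, k=3), k_global(x, k=3))
```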
