Feb. 13, 2024, 5:42 a.m. | Itay Safran, Daniel Reichman, Paul Valiant

cs.LG updates on arXiv.org

We prove an exponential separation between depth 2 and depth 3 neural networks when approximating an $\mathcal{O}(1)$-Lipschitz target function to constant accuracy, with respect to a distribution supported in $[0,1]^{d}$, assuming exponentially bounded weights. This addresses an open problem posed in \citet{safran2019depth}, and proves that the curse of dimensionality manifests in depth 2 approximation even in cases where the target function can be represented efficiently using depth 3. Previously, lower bounds that were used to separate depth 2 from …
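To make the shape of the claimed separation concrete, here is a hedged LaTeX paraphrase assembled only from the quantities named in the abstract. The symbols $f$, $\mu$, $N_{2}$, $N_{3}$, and $\varepsilon_{0}$ are illustrative stand-ins for the paper's actual constructions, and the exact exponents and error metric are those of the paper:

% Sketch only: f, \mu, \varepsilon_0, and the precise exponents are placeholders
% inferred from the abstract's wording, not the paper's exact statement.
\exists\, f:[0,1]^{d}\to\mathbb{R}\ \text{($\mathcal{O}(1)$-Lipschitz)},\ \exists\, \mu\ \text{supported in}\ [0,1]^{d}\ \text{such that}
\quad \text{some depth-3 network}\ N_{3}\ \text{of size}\ \mathrm{poly}(d)\ \text{attains}\ \mathbb{E}_{x\sim\mu}\!\left[\lvert N_{3}(x)-f(x)\rvert\right]\le\varepsilon_{0},
\quad \text{while every depth-2 network}\ N_{2}\ \text{with weights of magnitude at most}\ \exp(\mathrm{poly}(d))
\quad \text{satisfying}\ \mathbb{E}_{x\sim\mu}\!\left[\lvert N_{2}(x)-f(x)\rvert\right]\le\varepsilon_{0}\ \text{must have width}\ \exp(\Omega(d)).

The explicit magnitude bound on the depth-2 weights mirrors the abstract's "exponentially bounded weights" assumption.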

