July 1, 2022, 1:11 a.m. | Florian Graf, Sebastian Zeng, Bastian Rieck, Marc Niethammer, Roland Kwitt

stat.ML updates on arXiv.org

We study the excess capacity of deep networks in the context of supervised
classification. That is, given a capacity measure of the underlying hypothesis
class -- in our case, empirical Rademacher complexity -- by how much can we (a
priori) constrain this class while retaining an empirical error on a par with
the unconstrained regime? To assess excess capacity in modern architectures
(such as residual networks), we extend and unify prior Rademacher complexity
bounds to accommodate function composition and addition, …
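For readers unfamiliar with the capacity measure used above, empirical Rademacher complexity measures how well a hypothesis class can correlate with random sign labels on a fixed sample: R_S(H) = E_σ[ sup_{h∈H} (1/n) Σ_i σ_i h(x_i) ], with σ_i uniform in {−1, +1}. The following is a minimal Monte Carlo sketch of this quantity for a finite hypothesis class, purely illustrative and not the paper's bound; the function name and the matrix-of-predictions representation are assumptions for the example.

```python
import numpy as np

def empirical_rademacher(preds, n_draws=1000, seed=0):
    """Monte Carlo estimate of empirical Rademacher complexity.

    preds: (k, n) array; row j holds the outputs of hypothesis h_j
           on the n fixed sample points.
    Estimates E_sigma[ sup_h (1/n) sum_i sigma_i * h(x_i) ].
    """
    rng = np.random.default_rng(seed)
    k, n = preds.shape
    # draw n_draws independent Rademacher sign vectors
    sigma = rng.choice([-1.0, 1.0], size=(n_draws, n))
    # average signed correlation of each sign vector with each hypothesis
    corr = sigma @ preds.T / n          # shape (n_draws, k)
    # sup over hypotheses, then expectation over sign draws
    return corr.max(axis=1).mean()
```

A class containing only the zero function has complexity 0, while richer classes (more rows in `preds`) can only increase the estimate, matching the intuition that constraining the class, as the paper proposes, reduces this capacity measure.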
