On Measuring Excess Capacity in Neural Networks. (arXiv:2202.08070v2 [cs.LG] UPDATED)
July 1, 2022, 1:11 a.m. | Florian Graf, Sebastian Zeng, Bastian Rieck, Marc Niethammer, Roland Kwitt
stat.ML updates on arXiv.org
We study the excess capacity of deep networks in the context of supervised
classification. That is, given a capacity measure of the underlying hypothesis
class -- in our case, empirical Rademacher complexity -- by how much can we (a
priori) constrain this class while retaining an empirical error on a par with
the unconstrained regime? To assess excess capacity in modern architectures
(such as residual networks), we extend and unify prior Rademacher complexity
bounds to accommodate function composition and addition, …
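The capacity measure named in the abstract, empirical Rademacher complexity, can be illustrated with a small Monte Carlo estimate. The sketch below is an assumption for illustration only: it uses a toy finite class of threshold classifiers rather than the deep network classes and analytic bounds the paper actually studies.

```python
import numpy as np

# Empirical Rademacher complexity of a class F on a sample x_1..x_n:
#   R_hat(F) = E_sigma[ sup_{f in F} (1/n) * sum_i sigma_i * f(x_i) ]
# where sigma_i are i.i.d. uniform {-1, +1} signs. Here we estimate the
# expectation by Monte Carlo for a toy finite class (illustrative only).

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1, 1, size=n)

# Toy hypothesis class: threshold classifiers f_t(x) = sign(x - t).
thresholds = np.linspace(-1, 1, 21)
F = np.sign(x[None, :] - thresholds[:, None])  # shape (|F|, n)

def empirical_rademacher(F, n_draws=2000, rng=rng):
    """Monte Carlo estimate of R_hat for a finite class given as a matrix."""
    n = F.shape[1]
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)  # random sign vector
        total += np.max(F @ sigma) / n           # sup over the class
    return total / n_draws

print(empirical_rademacher(F))
```

Constraining the class (e.g. narrowing the range of thresholds) shrinks this quantity; the paper's question is how far such constraints can go for deep networks before empirical error degrades.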