Non-Vacuous Generalisation Bounds for Shallow Neural Networks. (arXiv:2202.01627v3 [cs.LG] UPDATED)
Web: http://arxiv.org/abs/2202.01627
June 16, 2022, 1:12 a.m. | Felix Biggs, Benjamin Guedj
stat.ML updates on arXiv.org
We focus on a specific class of shallow neural networks with a single hidden
layer, namely those with $L_2$-normalised data and either a sigmoid-shaped
Gaussian error function ("erf") activation or a Gaussian Error Linear Unit
(GELU) activation. For these networks, we derive new generalisation bounds
through the PAC-Bayesian theory; unlike most existing such bounds, they apply to
neural networks with deterministic rather than randomised parameters. Our
bounds are empirically non-vacuous when the network is trained with vanilla
stochastic gradient descent …
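To make the model class concrete, the following is a minimal Python/NumPy sketch of such a network: a single hidden layer applied to $L_2$-normalised inputs with either the erf or the GELU activation. It is illustrative only; names such as shallow_net, W and v are hypothetical, and details like the absence of biases and the output scaling are assumptions rather than the authors' exact parameterisation.

    import numpy as np
    from scipy.special import erf

    def l2_normalise(x):
        # Project each input vector onto the unit L2 sphere, as assumed for the data.
        return x / np.linalg.norm(x, axis=-1, keepdims=True)

    def gelu(z):
        # Gaussian Error Linear Unit: z * Phi(z), with Phi the standard normal CDF.
        return 0.5 * z * (1.0 + erf(z / np.sqrt(2.0)))

    def shallow_net(x, W, v, activation="erf"):
        # One-hidden-layer network f(x) = v^T sigma(W x) on L2-normalised inputs.
        # W has shape (hidden_dim, input_dim); v has shape (hidden_dim,).
        x = l2_normalise(x)
        z = x @ W.T                    # pre-activations, shape (batch, hidden_dim)
        h = erf(z) if activation == "erf" else gelu(z)
        return h @ v                   # real-valued score; its sign gives the predicted label

For context, PAC-Bayesian analyses of this kind typically start from the classical PAC-Bayes-kl inequality (Maurer, 2004): with probability at least $1-\delta$ over an i.i.d. sample of size $n$, $\mathrm{kl}\big(\hat{L}_S(Q)\,\|\,L(Q)\big) \le \big(\mathrm{KL}(Q\|P) + \ln(2\sqrt{n}/\delta)\big)/n$ for all posteriors $Q$. The paper's specific bounds for networks with deterministic parameters are not reproduced here.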