Concentration inequalities and optimal number of layers for stochastic deep neural networks. (arXiv:2206.11241v1 [cs.LG])
Web: http://arxiv.org/abs/2206.11241
June 23, 2022, 1:12 a.m. | Michele Caprio, Sayan Mukherjee
stat.ML updates on arXiv.org arxiv.org
We state concentration and martingale inequalities for the output of the
hidden layers of a stochastic deep neural network (SDNN), as well as for the
output of the whole SDNN. These results allow us to introduce an expected
classifier (EC) and to give a probabilistic upper bound for the classification
error of the EC. We also derive the optimal number of layers for the SDNN via an
optimal stopping procedure. We apply our analysis to a stochastic version of a
feedforward …
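The abstract's idea of picking a layer count by an optimal-stopping-style rule can be illustrated with a toy NumPy sketch. Everything here is an assumption for illustration only — the noise model, the `stochastic_forward` helper, and the spread budget are not the paper's actual SDNN construction or stopping procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_forward(x, weights, noise_std=0.1):
    """One stochastic pass: each layer adds Gaussian noise (illustrative only)."""
    h = x
    outputs = []
    for W in weights:
        h = np.tanh(W @ h + noise_std * rng.standard_normal(W.shape[0]))
        outputs.append(h)
    return outputs

dim, n_layers, n_passes = 8, 10, 200
weights = [rng.standard_normal((dim, dim)) / np.sqrt(dim) for _ in range(n_layers)]
x = rng.standard_normal(dim)

# Empirical concentration: spread of each hidden layer's output around its mean
# over many stochastic passes.  Shape: (passes, layers, dim).
runs = np.array([stochastic_forward(x, weights) for _ in range(n_passes)])
means = runs.mean(axis=0)
spread = np.linalg.norm(runs - means, axis=2).mean(axis=0)  # mean deviation per layer

# Toy stopping rule (hypothetical): stop at the first layer whose accumulated
# stochastic spread exceeds a budget, otherwise use all layers.
budget = 0.5
optimal_layers = next((l for l, s in enumerate(spread, 1) if s > budget), n_layers)
print("per-layer mean deviation:", np.round(spread, 3))
print("chosen number of layers:", optimal_layers)
```

The sketch only shows the shape of such a procedure — trading depth against how much the stochastic layers' outputs concentrate — not the martingale-based bounds the paper actually proves.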