Web: http://arxiv.org/abs/2206.10936

June 23, 2022, 1:10 a.m. | Masanari Kimura, Hideitsu Hino

cs.LG updates on arXiv.org

Dropout is one of the most popular regularization techniques in neural
network training. Because of its effectiveness and conceptual simplicity,
dropout has been analyzed extensively and many variants have been proposed. In
this paper, several properties of dropout are discussed in a unified manner
from the viewpoint of information geometry. We show that dropout flattens the
model manifold and that its regularization performance depends on the amount
of curvature. We then show that dropout essentially corresponds to a …
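For readers unfamiliar with the technique itself, here is a minimal sketch of standard "inverted" dropout in NumPy — the basic mechanism the paper analyzes, not its information-geometric formulation. The function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def dropout(x, p=0.5, rng=None, training=True):
    """Inverted dropout sketch (illustrative, not the paper's method).

    During training, each unit is zeroed independently with probability p,
    and the surviving units are rescaled by 1/(1-p) so the expected
    activation matches the no-dropout forward pass. At test time the
    input is returned unchanged.
    """
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng(rng)
    keep = rng.random(x.shape) >= p  # each unit kept with probability 1 - p
    return x * keep / (1.0 - p)
```

Because of the 1/(1-p) rescaling, averaging over many dropout masks recovers the original activations in expectation, which is what lets the same network be used without dropout at inference time.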
