Mean-field analysis for heavy ball methods: Dropout-stability, connectivity, and global convergence. (arXiv:2210.06819v1 [cs.LG])
Oct. 14, 2022, 1:14 a.m. | Diyuan Wu, Vyacheslav Kungurtsev, Marco Mondelli
stat.ML updates on arXiv.org arxiv.org
The stochastic heavy ball method (SHB), also known as stochastic gradient
descent (SGD) with Polyak's momentum, is widely used in training neural
networks. However, despite the remarkable success of such an algorithm in
practice, its theoretical characterization remains limited. In this paper, we
focus on neural networks with two and three layers and provide a rigorous
understanding of the properties of the solutions found by SHB: \emph{(i)}
stability after dropping out part of the neurons, \emph{(ii)} connectivity
along a low-loss path, …
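For readers unfamiliar with the method, the SHB update mentioned above can be sketched as follows. This is a minimal illustration, not the paper's setup: the quadratic objective, step size, and momentum coefficient are assumptions chosen only to show the update rule.

```python
import numpy as np

def shb_step(x, v, grad, lr=0.1, beta=0.9):
    """One heavy ball (Polyak momentum) step:
    v <- beta * v - lr * grad(x);  x <- x + v."""
    v = beta * v - lr * grad(x)
    return x + v, v

# Illustrative objective (assumption): f(x) = 0.5 * ||x||^2, so grad f(x) = x.
grad = lambda x: x

x = np.array([5.0, -3.0])   # arbitrary starting point
v = np.zeros_like(x)        # momentum buffer starts at zero
for _ in range(200):
    x, v = shb_step(x, v, grad)

print(np.linalg.norm(x))    # iterate approaches the minimizer at the origin
```

In the stochastic setting studied in the paper, `grad` would be a minibatch gradient of the training loss rather than an exact gradient; the update rule itself is unchanged.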