Improved Stability and Generalization Guarantees of the Decentralized SGD Algorithm
Feb. 15, 2024, 5:43 a.m. | Batiste Le Bars, Aurélien Bellet, Marc Tommasi, Kevin Scaman, Giovanni Neglia
cs.LG updates on arXiv.org arxiv.org
Abstract: This paper presents a new generalization error analysis for Decentralized Stochastic Gradient Descent (D-SGD) based on algorithmic stability. The obtained results overhaul a series of recent works that suggested an increased instability due to decentralization and a detrimental impact of poorly-connected communication graphs on generalization. On the contrary, we show, for convex, strongly convex and non-convex functions, that D-SGD can always recover generalization bounds analogous to those of classical SGD, suggesting that the choice of …
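The abstract refers to D-SGD running over a communication graph. For orientation, here is a minimal sketch of one D-SGD round, assuming n nodes, a doubly stochastic mixing matrix W built from that graph, and per-node stochastic gradients; the function names, data layout, and step ordering (gossip averaging followed by a local gradient step) are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def dsgd_round(params, W, grad_fn, data, lr, rng):
    """One round of Decentralized SGD (illustrative sketch).

    params : (n, d) array, row i holds node i's current model.
    W      : (n, n) doubly stochastic mixing matrix; W[i, j] > 0 only if
             nodes i and j are neighbors in the communication graph.
    grad_fn: callable (x, sample) -> stochastic gradient of the local loss.
    data   : list of n local datasets, one per node.
    """
    n, d = params.shape
    grads = np.empty_like(params)
    for i in range(n):
        sample = data[i][rng.integers(len(data[i]))]   # draw a local sample
        grads[i] = grad_fn(params[i], sample)          # local stochastic gradient
    # Gossip averaging with neighbors, then a local gradient step.
    return W @ params - lr * grads

The connectivity of W (how well the graph mixes information) is what earlier stability analyses suggested could hurt generalization; the paper's claim is that, under the settings it studies, D-SGD nonetheless admits generalization bounds analogous to those of centralized SGD.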