Optimal Complexity in Decentralized Training. (arXiv:2006.08085v4 [cs.LG] UPDATED)
Web: http://arxiv.org/abs/2006.08085
Jan. 31, 2022, 2:11 a.m. | Yucheng Lu, Christopher De Sa
cs.LG updates on arXiv.org
Decentralization is a promising method of scaling up parallel machine learning systems. In this paper, we provide a tight lower bound on the iteration complexity for such methods in a stochastic non-convex setting. Our lower bound reveals a theoretical gap in the known convergence rates of many existing decentralized training algorithms, such as D-PSGD. We prove by construction that this lower bound is tight and achievable. Motivated by our insights, we further propose DeTAG, a practical gossip-style decentralized algorithm that achieves the …
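The truncated abstract doesn't include the algorithm itself, but the gossip-style update it refers to (with D-PSGD as the canonical example) follows a simple pattern: each worker keeps its own model copy, averages it with its neighbors' copies through a doubly stochastic mixing matrix W, and then takes a local stochastic gradient step. Below is a minimal single-machine sketch of that update; the ring topology, toy quadratic objective, and all names are illustrative assumptions, not code or details from the paper.

```python
import numpy as np

# Illustrative single-machine simulation of a gossip-style decentralized SGD
# step in the spirit of D-PSGD. Topology, objective, and hyperparameters are
# assumptions for this sketch, not details from the paper.

n_workers, dim = 4, 10
rng = np.random.default_rng(0)

# Doubly stochastic mixing matrix for a ring: each worker averages equally
# with itself and its two neighbors.
W = np.zeros((n_workers, n_workers))
for i in range(n_workers):
    for j in (i - 1, i, i + 1):
        W[i, j % n_workers] = 1.0 / 3.0

x = rng.normal(size=(n_workers, dim))  # per-worker model copies
target = rng.normal(size=dim)          # shared optimum of a toy quadratic loss

def local_grad(xi):
    # Noisy gradient of f_i(x) = 0.5 * ||x - target||^2, standing in for
    # each worker's local stochastic objective.
    return (xi - target) + rng.normal(scale=0.1, size=xi.shape)

lr = 0.1
for _ in range(100):
    grads = np.stack([local_grad(x[i]) for i in range(n_workers)])
    # Gossip-average neighbor models, then take a local SGD step:
    # x_i <- sum_j W[i, j] * x_j - lr * g_i
    x = W @ x - lr * grads

print("consensus error:", np.linalg.norm(x - x.mean(axis=0)))
print("distance to optimum:", np.linalg.norm(x.mean(axis=0) - target))
```

The spectral gap of the mixing matrix W governs how quickly the averaging step drives the workers toward consensus, which is why graph topology appears in the convergence rates for this family of algorithms.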
Latest AI/ML/Big Data Jobs
Senior Data Analyst @ Fanatics Inc | Remote - New York
Data Engineer - Search @ Cytora | United Kingdom - Remote
Product Manager, Technical - Data Infrastructure and Streaming @ Nubank | Berlin
Postdoctoral Fellow: ML for autonomous materials discovery @ Lawrence Berkeley National Lab | Berkeley, CA
Principal Data Scientist @ Zuora | Remote
Data Engineer @ Veeva Systems | Pennsylvania - Fort Washington