Adaptive Worker Grouping For Communication-Efficient and Straggler-Tolerant Distributed SGD. (arXiv:2201.04301v1 [cs.IT])
Jan. 13, 2022, 2:10 a.m. | Feng Zhu, Jingjing Zhang, Osvaldo Simeone, Xin Wang
cs.LG updates on arXiv.org arxiv.org
Wall-clock convergence time and communication load are key performance metrics for the distributed implementation of stochastic gradient descent (SGD) in parameter server settings. Communication-adaptive distributed Adam (CADA) has recently been proposed as a way to reduce communication load via the adaptive selection of workers. CADA is subject to performance degradation in terms of wall-clock convergence time in the presence of stragglers. This paper proposes a novel scheme named grouping-based CADA (G-CADA) that retains the advantages of CADA in reducing the …
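The two mechanisms the abstract names can be illustrated with a toy simulation. In the sketch below, workers are partitioned into groups that each hold the same data shard, so the server only needs the fastest replica per group (straggler tolerance), and a group re-uploads its gradient only when it has changed enough since the last upload (CADA-style lazy communication). This is a minimal illustration under assumed dynamics, not the algorithm from the paper: the quadratic objective, the exponential delay model, and the `threshold` rule are all simplifications chosen for the example.

```python
import random

def grad(w, shard):
    # Gradient of the toy objective 0.5 * (w - x)^2 averaged over a shard.
    return sum(w - x for x in shard) / len(shard)

def g_cada_sgd(data, num_groups=4, workers_per_group=3,
               lr=0.1, rounds=50, threshold=0.01, seed=0):
    """Toy grouping-based SGD with lazy (CADA-style) uploads.

    Every worker in a group holds the same shard, so the server can use
    whichever replica finishes first and ignore the group's stragglers.
    A group uploads a fresh gradient only when it differs from its last
    upload by at least `threshold`; otherwise the server reuses the
    stale value, saving one communication.
    """
    rng = random.Random(seed)
    # Replicated sharding: group g's shard is data[g::num_groups].
    shards = [data[g::num_groups] for g in range(num_groups)]
    w = 0.0
    last_grad = [0.0] * num_groups  # most recent upload per group
    comms = 0                       # total gradient uploads
    for _ in range(rounds):
        agg = 0.0
        for g in range(num_groups):
            # Simulated compute delays: the round's latency for this
            # group is the *minimum* over its replicas (fastest wins).
            delays = [rng.expovariate(1.0) for _ in range(workers_per_group)]
            _ = min(delays)  # stragglers in the group are masked
            g_new = grad(w, shards[g])
            # Lazy upload: communicate only if the gradient moved enough.
            if abs(g_new - last_grad[g]) >= threshold:
                last_grad[g] = g_new
                comms += 1
            agg += last_grad[g]
        w -= lr * agg / num_groups
    return w, comms

if __name__ == "__main__":
    data = [1.0, 2.0, 3.0, 4.0] * 5
    w, comms = g_cada_sgd(data)
    print(f"w = {w:.3f}, uploads = {comms}")
```

With the toy data above, `w` converges toward the data mean (2.5), and `comms` stays below the `rounds * num_groups` uploads that eager communication would require, showing the communication saving from lazy uploads.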