Web: http://arxiv.org/abs/2102.12528

June 17, 2022, 1:11 a.m. | Constantin Philippenko, Aymeric Dieuleveut

cs.LG updates on arXiv.org

We develop a new approach to tackle communication constraints in a
distributed learning problem with a central server. We propose and analyze a
new algorithm that performs bidirectional compression and achieves the same
convergence rate as algorithms using only uplink (from the local workers to the
central server) compression. To obtain this improvement, we design MCM, an
algorithm such that the downlink compression only impacts local models, while
the global model is preserved. As a result, and contrary to previous …
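To make the setting concrete, here is a hypothetical toy sketch (not the authors' MCM algorithm, whose analysis and memory mechanism are in the paper) of bidirectional compression on a simple quadratic objective. The `rand_sparsify` compressor, the learning rate, and the worker objectives are all assumptions for illustration; the key property mirrored from the abstract is that the server's global model stays exact while only the workers' local copies see the compressed downlink message.

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_sparsify(v, k):
    """Unbiased random-k sparsification (an assumed stand-in compressor:
    keep k of d coordinates, rescaled by d/k so E[output] = v)."""
    d = len(v)
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(v)
    out[idx] = v[idx] * (d / k)
    return out

# Toy local objectives: f_i(w) = 0.5 * ||w - t_i||^2, so grad f_i(w) = w - t_i.
targets = [np.full(10, float(i)) for i in range(4)]  # optimum = mean = 1.5

w_global = np.zeros(10)  # exact global model, kept uncompressed on the server
lr = 0.05
for _ in range(500):
    # Downlink: each worker receives a compressed copy of the global model;
    # only these local models are perturbed, w_global itself is preserved.
    local_models = [rand_sparsify(w_global, k=5) for _ in targets]
    grads = [w_loc - t for w_loc, t in zip(local_models, targets)]
    # Uplink: workers compress their gradients before sending to the server.
    agg = np.mean([rand_sparsify(g, k=5) for g in grads], axis=0)
    w_global -= lr * agg

# w_global drifts toward the minimizer of the average objective (all-1.5 vector),
# up to noise injected by the two unbiased compression steps.
```

Because both compressors are unbiased, the update is a stochastic gradient step on the average objective; the fluctuation around the optimum scales with the step size and the compression variance, which is the regime the paper's convergence analysis addresses.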

