A Computation and Communication Efficient Method for Distributed Nonconvex Problems in the Partial Participation Setting. (arXiv:2205.15580v2 [cs.LG] UPDATED)
Oct. 14, 2022, 1:13 a.m. | Alexander Tyurin, Peter Richtárik
cs.LG updates on arXiv.org (arxiv.org)
We present a new method that includes three key components of distributed optimization and federated learning: variance reduction of stochastic gradients, compressed communication, and partial participation. We prove that the new method has optimal oracle complexity and state-of-the-art communication complexity in the partial participation setting. Moreover, we observe that "1 + 1 + 1 is not 3": by mixing variance reduction of stochastic gradients with compressed communication and partial participation, we do not obtain a fully synergetic effect. We explain …
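To make the three components concrete, here is a minimal, illustrative sketch of one way they can be combined in a single training loop: clients are sampled with some probability (partial participation), send sparsified messages (compressed communication), and transmit gradient differences that update a running gradient estimator (variance reduction). This is not the paper's algorithm; the quadratic objective, the rand-k compressor, the MARINA-style difference estimator, and all constants are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_clients, k, participation = 20, 10, 5, 0.3  # dim, #clients, coords kept, sampling prob

# Each client i holds a simple quadratic objective f_i(x) = 0.5 * ||A_i x - b_i||^2 (assumed).
A = [rng.standard_normal((30, d)) for _ in range(n_clients)]
b = [rng.standard_normal(30) for _ in range(n_clients)]

def grad(i, x):
    # Gradient of client i's local objective.
    return A[i].T @ (A[i] @ x - b[i])

def rand_k(v, k):
    # Unbiased rand-k sparsification: keep k random coordinates, rescale by d/k.
    mask = np.zeros_like(v)
    idx = rng.choice(v.size, size=k, replace=False)
    mask[idx] = v.size / k
    return v * mask

x = np.zeros(d)
g = np.mean([grad(i, x) for i in range(n_clients)], axis=0)  # initial full gradient
lr = 1e-3

for t in range(200):
    x_new = x - lr * g
    # Partial participation: each client is active in this round with probability `participation`.
    active = [i for i in range(n_clients) if rng.random() < participation]
    if active:
        # Variance reduction + compression: active clients send compressed
        # gradient *differences* between the new and old iterates, which
        # update the server's running gradient estimator g.
        deltas = [rand_k(grad(i, x_new) - grad(i, x), k) for i in active]
        g = g + np.mean(deltas, axis=0)
    x = x_new

print("final mean loss:",
      np.mean([0.5 * np.linalg.norm(A[i] @ x - b[i]) ** 2 for i in range(n_clients)]))
```

The sketch only shows how the three mechanisms interact in a round; the paper's complexity guarantees depend on the specific estimator, compressor, and participation model it analyzes.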
More from arxiv.org / cs.LG updates on arXiv.org
Generalized Schrödinger Bridge Matching
1 day, 5 hours ago
arxiv.org
Tight bounds on Pauli channel learning without entanglement
1 day, 5 hours ago
arxiv.org
Jobs in AI, ML, Big Data
Senior Machine Learning Engineer (MLOps)
@ Promaton | Remote, Europe
Data Analyst - Associate
@ JPMorgan Chase & Co. | Mumbai, Maharashtra, India
Staff Data Engineer (Data Platform)
@ Coupang | Seoul, South Korea
AI/ML Engineering Research Internship
@ Keysight Technologies | Santa Rosa, CA, United States
Sr. Director, Head of Data Management and Reporting Execution
@ Biogen | Cambridge, MA, United States
Manager, Marketing - Audience Intelligence (Senior Data Analyst)
@ Delivery Hero | Singapore, Singapore