Communication-Efficient ADMM-based Federated Learning. (arXiv:2110.15318v2 [cs.LG] UPDATED)
Jan. 17, 2022, 2:11 a.m. | Shenglong Zhou, Geoffrey Ye Li
cs.LG updates on arXiv.org arxiv.org
Federated learning has advanced considerably over the past few years but still faces many challenges, such as how algorithms can save communication resources, how they can reduce computational costs, and whether they converge. To address these issues, this paper proposes exact and inexact ADMM-based federated learning. Both variants are not only communication-efficient but also converge linearly under very mild conditions, such as requiring no convexity and being independent of the data distributions. Moreover, the inexact version has low computational complexity, thereby significantly alleviating the computational burden.
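To make the setting concrete, here is a minimal sketch of consensus ADMM applied to federated least squares. This is an illustration of the generic ADMM template (local subproblem, server averaging, dual update), not the paper's specific exact/inexact algorithms; the problem data, step parameter `rho`, and round count are all assumptions chosen for the example.

```python
import numpy as np

# Illustrative consensus-ADMM sketch for federated least squares.
# Each client k holds (A_k, b_k) and solves a regularized local
# subproblem; the server aggregates; duals enforce consensus.

rng = np.random.default_rng(0)
d, clients = 5, 4
true_w = rng.normal(size=d)

# Synthetic per-client datasets (standing in for private local data).
data = []
for _ in range(clients):
    A = rng.normal(size=(20, d))
    b = A @ true_w + 0.01 * rng.normal(size=20)
    data.append((A, b))

rho = 1.0                                    # ADMM penalty parameter
x = [np.zeros(d) for _ in range(clients)]    # local models
u = [np.zeros(d) for _ in range(clients)]    # scaled dual variables
z = np.zeros(d)                              # global (server) model

for _ in range(50):                          # communication rounds
    # Client step: closed-form solution of the local subproblem
    #   min_x 0.5*||A x - b||^2 + (rho/2)*||x - z + u||^2
    for k, (A, b) in enumerate(data):
        x[k] = np.linalg.solve(A.T @ A + rho * np.eye(d),
                               A.T @ b + rho * (z - u[k]))
    # Server step: average the local models plus their duals.
    z = np.mean([x[k] + u[k] for k in range(clients)], axis=0)
    # Dual update: accumulate each client's consensus violation.
    for k in range(clients):
        u[k] += x[k] - z

print(np.linalg.norm(z - true_w))            # small after convergence
```

An "inexact" variant, in the spirit the abstract describes, would replace the closed-form local solve with a few cheap gradient steps, trading per-round computation for more rounds.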