Feb. 15, 2024, 5:42 a.m. | Tao Yu, Congzheng Song, Jianyu Wang, Mona Chitnis

cs.LG updates on arXiv.org

arXiv:2402.09247v1 Announce Type: new
Abstract: Asynchronous protocols have been shown to improve the scalability of federated learning (FL) with a massive number of clients. Meanwhile, momentum-based methods can achieve the best model quality in synchronous FL. However, naively applying momentum in asynchronous FL algorithms leads to slower convergence and degraded model performance. It remains unclear how to effectively combine these two techniques to achieve a win-win. In this paper, we find that asynchrony introduces implicit bias to momentum …
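To make the failure mode the abstract describes concrete, below is a minimal sketch, not the paper's algorithm, of server-side momentum in federated averaging (in the style of FedAvgM) and a naive asynchronous variant. All names here (apply_momentum, beta, lr) are illustrative assumptions; the point is only that in the asynchronous case each arriving delta was computed against a stale copy of the model, so the momentum buffer mixes gradients from different model versions.

import numpy as np

def apply_momentum(weights, buffer, client_deltas, beta=0.9, lr=1.0):
    """One synchronous server step: average the client deltas from this
    round, fold the average into the momentum buffer, and update the
    global weights. Every delta was computed against the same weights."""
    avg_delta = np.mean(client_deltas, axis=0)  # synchronous aggregation
    buffer = beta * buffer + avg_delta          # momentum accumulation
    weights = weights - lr * buffer             # server update
    return weights, buffer

def apply_momentum_async(weights, buffer, stale_delta, beta=0.9, lr=1.0):
    """Naive asynchronous step: each delta is applied as it arrives, but it
    was computed on an older copy of the weights, so stale directions
    accumulate in the buffer -- the implicit bias the abstract refers to."""
    buffer = beta * buffer + stale_delta
    weights = weights - lr * buffer
    return weights, buffer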

