April 17, 2023, 8:02 p.m. | Tuo Zhang, Lei Gao, Sunwoo Lee, Mi Zhang, Salman Avestimehr

cs.LG updates on arXiv.org

In cross-device Federated Learning (FL) environments, scaling synchronous FL
methods is challenging because stragglers hinder the training process.
Moreover, each client's availability to join training is highly variable over
time due to system heterogeneity and intermittent connectivity. Recent
asynchronous FL methods (e.g., FedBuff) have been proposed to overcome these
issues by allowing slower clients to continue local training on stale models
and to contribute to aggregation when ready. However, we show empirically
that …
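The buffered asynchronous aggregation idea that FedBuff introduced can be made concrete with a short sketch: the server keeps a buffer of client deltas, each client trains on whatever (possibly stale) model version it last received, and the server aggregates once the buffer holds K updates, discounting staler contributions. The code below is a minimal illustration under assumed details: the toy least-squares objective, the polynomial staleness discount 1/(1+τ)^0.5, and the names `client_update`, `staleness_weight`, and `BUFFER_SIZE` are all illustrative choices, not the paper's (or FedBuff's) actual implementation.

```python
import numpy as np

DIM = 10              # toy model size: a flat parameter vector
BUFFER_SIZE = 3       # K: aggregate once K client updates have arrived
STALENESS_EXP = 0.5   # exponent of the polynomial staleness discount (assumed)

def client_update(global_params, data, lr=0.1, steps=5):
    """A client's local training on its (possibly stale) model copy:
    a few SGD steps on a toy least-squares objective."""
    w = global_params.copy()
    X, y = data
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w - global_params  # ship the delta, not the full model

def staleness_weight(staleness, exp=STALENESS_EXP):
    """Down-weight updates computed against older server versions."""
    return 1.0 / (1.0 + staleness) ** exp

rng = np.random.default_rng(0)
server_params = np.zeros(DIM)
server_version = 0
history = [server_params.copy()]   # one model snapshot per server version
buffer = []                        # (delta, version the client started from)

# Simulate six clients finishing at different times; a slow client may
# have started from a model that is up to two versions old.
clients = [(rng.normal(size=(20, DIM)), rng.normal(size=20)) for _ in range(6)]
for data in clients:
    start_version = int(rng.integers(max(0, server_version - 2),
                                     server_version + 1))
    delta = client_update(history[start_version], data)
    buffer.append((delta, start_version))

    if len(buffer) >= BUFFER_SIZE:
        # Buffered aggregation: staleness-weighted average of the deltas.
        weights = np.array([staleness_weight(server_version - v)
                            for _, v in buffer])
        weights /= weights.sum()
        server_params = server_params + sum(
            w * d for w, (d, _) in zip(weights, buffer))
        server_version += 1
        history.append(server_params.copy())
        buffer.clear()

print("server version after all clients reported:", server_version)
```

The buffer-size trigger is what distinguishes this buffered style from fully asynchronous aggregation, where the server would apply each delta as soon as it arrives; in both cases no client ever blocks the round the way stragglers do under synchronous FL.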

