Stochastic Coded Federated Learning with Convergence and Privacy Guarantees. (arXiv:2201.10092v1 [cs.LG])
Jan. 26, 2022, 2:11 a.m. | Yuchang Sun, Jiawei Shao, Songze Li, Yuyi Mao, Jun Zhang
cs.LG updates on arXiv.org
Federated learning (FL) has attracted much attention as a privacy-preserving
distributed machine learning framework, where many clients collaboratively
train a machine learning model by exchanging model updates with a parameter
server instead of sharing their raw data. Nevertheless, FL training suffers
from slow convergence and unstable performance due to stragglers caused by the
heterogeneous computational resources of clients and fluctuating communication
rates. This paper proposes a coded FL framework, namely *stochastic coded
federated learning* (SCFL), to mitigate the straggler issue. …
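The abstract's description of the basic FL loop — clients train locally and exchange model updates with a parameter server rather than sharing raw data — can be sketched as follows. This is a minimal FedAvg-style illustration, not the paper's SCFL algorithm; all function names, the toy 1-D least-squares objective, and the synthetic data are illustrative assumptions.

```python
# Minimal FL sketch (NOT the paper's SCFL): clients send locally trained
# weights to a server, which averages them instead of collecting raw data.
import random

def local_update(w, data, lr=0.1):
    # One gradient-descent step on a toy 1-D least-squares loss
    # 0.5 * (w*x - y)^2 averaged over the client's (x, y) samples.
    grad = sum((w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_round(global_w, client_datasets):
    # Each client computes an update from the current global model;
    # the parameter server averages the returned weights.
    client_ws = [local_update(global_w, d) for d in client_datasets]
    return sum(client_ws) / len(client_ws)

random.seed(0)
# Five synthetic clients whose data follow y = 2*x plus small noise.
clients = [[(x, 2.0 * x + random.gauss(0, 0.01)) for x in (1.0, 2.0, 3.0)]
           for _ in range(5)]

w = 0.0
for _ in range(200):
    w = federated_round(w, clients)
print(w)  # converges near the true slope 2.0
```

In a real deployment each `local_update` would run on a separate device, and stragglers — the clients that are slow to return their update, which SCFL targets — would delay `federated_round` until the slowest client responds.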