Coordinating Momenta for Cross-silo Federated Learning. (arXiv:2102.03970v2 [cs.LG] UPDATED)
Jan. 20, 2022, 2:11 a.m. | An Xu, Heng Huang
cs.LG updates on arXiv.org
Communication efficiency is crucial for federated learning (FL). A common way to address this issue is to have clients perform multiple local training steps, reducing the communication frequency between clients and the server. However, this strategy leads to the client drift problem: data distributions are non-i.i.d. across clients, so local updates diverge, which severely degrades performance. In this work, we propose a new method to improve training performance in cross-silo FL by maintaining double momentum buffers. In our algorithm, one momentum …
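The abstract is truncated before it specifies the update rules, so the following is only an illustrative sketch of the general "double momentum buffer" idea on a toy least-squares problem: one momentum buffer lives on each client during local steps, and a second lives on the server and is applied to the averaged client updates. All function names, learning rates, and the aggregation rule here are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def client_update(w, data, lr=0.05, beta=0.9, local_steps=5):
    """Local SGD with a client-side momentum buffer (the first buffer)."""
    X, y = data
    m = np.zeros_like(w)
    for _ in range(local_steps):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)  # least-squares gradient
        m = beta * m + grad                      # heavy-ball momentum
        w = w - lr * m
    return w

def server_round(w_global, clients, server_m, server_beta=0.5, server_lr=1.0):
    """One communication round: average the clients' model deltas, then
    apply a server-side momentum buffer (the second buffer) globally."""
    deltas = [client_update(w_global.copy(), d) - w_global for d in clients]
    avg_delta = np.mean(deltas, axis=0)
    server_m = server_beta * server_m + avg_delta
    return w_global + server_lr * server_m, server_m
```

Giving each client a per-shard target (e.g. a client-specific label shift) makes the data non-i.i.d. and reproduces the client-drift setting the abstract describes; the server-side buffer smooths the averaged updates across rounds.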