Efficient and Light-Weight Federated Learning via Asynchronous Distributed Dropout. (arXiv:2210.16105v1 [cs.LG])
cs.LG updates on arXiv.org
Asynchronous learning protocols have regained attention lately, especially in the Federated Learning (FL) setup, where slower clients can severely impede the learning process. Herein, we propose AsyncDrop, a novel asynchronous FL framework that utilizes dropout regularization to handle device heterogeneity in distributed settings. Overall, AsyncDrop achieves better performance than state-of-the-art asynchronous methodologies, while incurring lower communication and training-time overheads. The key idea revolves around creating "submodels" out of the global model and distributing their …
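The abstract is truncated, but the stated idea of carving "submodels" out of a global model via dropout can be sketched in a minimal form. The snippet below is an illustrative assumption, not the paper's actual algorithm: it keeps a random subset of hidden units of a two-layer network, so a slow client would only train (and communicate) the retained slices of the weight matrices. The function name `make_submodel` and the 50% keep fraction are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Global model: a two-layer MLP stored as plain weight matrices.
W1 = rng.standard_normal((784, 256))  # input -> hidden
W2 = rng.standard_normal((256, 10))   # hidden -> output

def make_submodel(W1, W2, keep_frac, rng):
    """Carve a smaller submodel out of the global model by keeping a
    random subset of hidden units, in the spirit of dropout-based
    federated training: a client trains only the retained slices."""
    hidden = W1.shape[1]
    keep = rng.choice(hidden, size=int(keep_frac * hidden), replace=False)
    keep.sort()
    # Keep the matching columns of W1 and rows of W2 for those units.
    return W1[:, keep], W2[keep, :], keep

sub_W1, sub_W2, kept = make_submodel(W1, W2, keep_frac=0.5, rng=rng)
print(sub_W1.shape, sub_W2.shape)  # half the hidden units survive
```

On the server side, the client's updated slices would then be scattered back into the global matrices at the indices in `kept`; the communication saving comes from exchanging only the sliced weights.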