Decentralized Federated Learning: Balancing Communication and Computing Costs. (arXiv:2107.12048v3 [cs.LG] UPDATED)
Jan. 20, 2022, 2:11 a.m. | Wei Liu, Li Chen, Wenyi Zhang
cs.LG updates on arXiv.org
Decentralized stochastic gradient descent (SGD) is a driving engine for decentralized federated learning (DFL). The performance of decentralized SGD is jointly influenced by inter-node communications and local updates. In this paper, we propose a general DFL framework that periodically performs both multiple local updates and multiple inter-node communications, striking a balance between communication efficiency and model consensus. The framework also serves as a general analytical framework for decentralized SGD. We establish strong convergence guarantees for the proposed DFL algorithm without the assumption …
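The periodic scheme the abstract describes is concrete enough to sketch. Below is a minimal Python illustration, not the authors' implementation: each node alternates tau1 local SGD steps on its own data with tau2 gossip-averaging rounds over a fixed mixing matrix. The least-squares objective, the ring topology W, and all parameter values here are illustrative assumptions.

# Minimal sketch of periodic decentralized SGD (assumed setup, not the paper's code):
# each node runs tau1 local SGD steps, then the network runs tau2 gossip rounds.
import numpy as np

rng = np.random.default_rng(0)

n_nodes, dim = 4, 10
tau1, tau2 = 5, 2        # local updates per period, gossip rounds per period
lr, periods = 0.05, 50

# Doubly stochastic mixing matrix over a ring topology (an assumption;
# the framework allows any connected topology).
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i + 1) % n_nodes] = 0.25
    W[i, (i - 1) % n_nodes] = 0.25

# Each node holds private data for a toy least-squares objective.
A = [rng.normal(size=(20, dim)) for _ in range(n_nodes)]
b = [a @ rng.normal(size=dim) + 0.1 * rng.normal(size=20) for a in A]

x = np.zeros((n_nodes, dim))  # one model copy per node

for _ in range(periods):
    # Phase 1: tau1 local SGD steps on each node's own data.
    for _ in range(tau1):
        for i in range(n_nodes):
            j = rng.integers(len(b[i]))                   # sample one datum
            grad = (A[i][j] @ x[i] - b[i][j]) * A[i][j]   # stochastic gradient
            x[i] -= lr * grad
    # Phase 2: tau2 inter-node communications (gossip averaging
    # drives the per-node models toward consensus).
    for _ in range(tau2):
        x = W @ x

print("consensus gap:", np.linalg.norm(x - x.mean(axis=0)))

In this sketch, raising tau2 shrinks the printed consensus gap at the cost of extra communication, while raising tau1 does the opposite; choosing these two periods is the communication/consensus balance the framework parameterizes.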