FedComLoc: Communication-Efficient Distributed Training of Sparse and Quantized Models
March 18, 2024, 4:41 a.m. | Kai Yi, Georg Meinhardt, Laurent Condat, Peter Richtárik
cs.LG updates on arXiv.org
Abstract: Federated Learning (FL) has garnered increasing attention due to its unique characteristic of allowing heterogeneous clients to process their private data locally and interact with a central server, while being respectful of privacy. A critical bottleneck in FL is the communication cost. A pivotal strategy to mitigate this burden is Local Training, which involves running multiple local stochastic gradient descent iterations between communication phases. Our work is inspired by the innovative Scaffnew algorithm, which has …
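To make the Local Training idea concrete: below is a minimal sketch of the pattern the abstract describes, in plain FedAvg form. It is not the paper's FedComLoc method (which additionally applies sparsification and quantization on top of a Scaffnew-style scheme); all function names and the toy least-squares objective are illustrative assumptions.

```python
import numpy as np

def local_sgd(w, X, y, lr=0.01, local_steps=10):
    """Run several local SGD steps on one client's private data
    (least-squares loss) before communicating with the server."""
    w = w.copy()
    for _ in range(local_steps):
        i = np.random.randint(len(X))       # sample one local data point
        grad = (X[i] @ w - y[i]) * X[i]     # stochastic gradient of 0.5*(x.w - y)^2
        w -= lr * grad
    return w

def communication_round(w_server, clients, lr=0.01, local_steps=10):
    """One communication phase: each client trains locally from the
    server model, then the server averages the returned models."""
    updates = [local_sgd(w_server, X, y, lr, local_steps) for X, y in clients]
    return np.mean(updates, axis=0)

# Toy run: 3 heterogeneous clients with synthetic data (hypothetical setup).
rng = np.random.default_rng(0)
d = 5
w_true = rng.normal(size=d)
clients = []
for _ in range(3):
    X = rng.normal(size=(50, d))
    y = X @ w_true + 0.1 * rng.normal(size=50)
    clients.append((X, y))

w = np.zeros(d)
for _ in range(20):                         # 20 communication rounds
    w = communication_round(w, clients)
print("distance to w_true:", np.linalg.norm(w - w_true))
```

The communication saving comes from `local_steps > 1`: clients exchange models with the server once per round rather than once per gradient step. FedComLoc would further compress what is transmitted, which this sketch omits.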