Decentralized Personalized Federated Learning based on a Conditional Sparse-to-Sparser Scheme
April 25, 2024, 7:42 p.m. | Qianyu Long, Qiyuan Wang, Christos Anagnostopoulos, Daning Bi
cs.LG updates on arXiv.org
Abstract: Decentralized Federated Learning (DFL) has become popular due to its robustness and its avoidance of centralized coordination. In this paradigm, clients actively engage in training by exchanging models with their networked neighbors. However, DFL introduces increased costs in terms of training and communication. Existing methods focus on minimizing communication, often overlooking training efficiency and data heterogeneity. To address this gap, we propose a novel sparse-to-sparser training scheme, DA-DPFL. DA-DPFL initializes with a subset of model parameters, …
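The abstract is truncated before the algorithmic details, so the following is only a minimal illustrative sketch of the general sparse-to-sparser idea in a decentralized setting: clients gossip masked models with their neighbors, then tighten a magnitude-pruning mask each round. The ring topology, the pruning rule, and the geometric sparsity schedule are all placeholder assumptions, not DA-DPFL's actual design.

```python
# Hypothetical sketch of sparse-to-sparser decentralized averaging (not DA-DPFL itself).
import numpy as np

rng = np.random.default_rng(0)

N_CLIENTS, DIM = 4, 20
# Assumed ring topology: each client exchanges models with its two neighbors.
neighbors = {i: [(i - 1) % N_CLIENTS, (i + 1) % N_CLIENTS] for i in range(N_CLIENTS)}

# Each client starts from a random dense model with an all-ones (dense) mask.
models = [rng.normal(size=DIM) for _ in range(N_CLIENTS)]
masks = [np.ones(DIM) for _ in range(N_CLIENTS)]

def prune(model, keep_frac):
    """Magnitude pruning: keep only the largest-|w| fraction of weights."""
    k = max(1, int(keep_frac * model.size))
    thresh = np.sort(np.abs(model))[-k]
    return (np.abs(model) >= thresh).astype(float)

for rnd in range(5):
    keep_frac = 0.8 * (0.8 ** rnd)  # "sparse-to-sparser": density shrinks each round
    new_models = []
    for i in range(N_CLIENTS):
        # Gossip step: average own masked model with the neighbors' masked models.
        group = [models[i] * masks[i]] + [models[j] * masks[j] for j in neighbors[i]]
        avg = np.mean(group, axis=0)
        # (A local training step on client i's private data would go here.)
        new_models.append(avg)
    models = new_models
    masks = [prune(m, keep_frac) for m in models]
    print(f"round {rnd}: keep_frac={keep_frac:.2f}, "
          f"nonzeros={int(masks[0].sum())}/{DIM}")
```

Progressively tightening the masks is one plausible way to reduce both training and communication cost over rounds, which is the trade-off the abstract highlights; the paper's actual scheme and schedule may differ.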