April 25, 2024, 7:42 p.m. | Qianyu Long, Qiyuan Wang, Christos Anagnostopoulos, Daning Bi

cs.LG updates on arXiv.org

arXiv:2404.15943v1 Announce Type: new
Abstract: Decentralized Federated Learning (DFL) has become popular due to its robustness and avoidance of centralized coordination. In this paradigm, clients actively engage in training by exchanging models with their networked neighbors. However, DFL introduces increased costs in terms of training and communication. Existing methods focus on minimizing communication often overlooking training efficiency and data heterogeneity. To address this gap, we propose a novel \textit{sparse-to-sparser} training scheme: DA-DPFL. DA-DPFL initializes with a subset of model parameters, …
