Optimising Communication Overhead in Federated Learning Using NSGA-II. (arXiv:2204.02183v1 [cs.NE])
April 6, 2022, 1:12 a.m. | José Ángel Morell, Zakaria Abdelmoiz Dahi, Francisco Chicano, Gabriel Luque, Enrique Alba
cs.LG updates on arXiv.org arxiv.org
Federated learning is a training paradigm in which a server-based
model is trained cooperatively from local models running on edge devices,
while preserving data privacy. These devices exchange information that induces
a substantial communication load, which jeopardises operational efficiency.
The difficulty of reducing this overhead lies in doing so without
degrading the model's efficiency (a contradictory relation). To this end, many
works have investigated compressing the pre/mid/post-trained models and reducing
the communication rounds, separately, although they jointly contribute to …
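The abstract frames the problem as a bi-objective trade-off: reduce communication load without hurting model quality. The core mechanism NSGA-II uses for such trade-offs is non-dominated sorting. Below is a minimal, hypothetical sketch of that idea, assuming candidate configurations are scored by two objectives to be minimised, (communication cost, model error); the candidate values are illustrative, not from the paper.

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset (the first NSGA-II front)."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical candidates: (communication cost in MB per round, model error rate)
candidates = [(120.0, 0.08), (60.0, 0.12), (60.0, 0.09), (200.0, 0.07), (90.0, 0.20)]
front = pareto_front(candidates)
# Dominated configurations such as (60.0, 0.12) and (90.0, 0.20) are filtered out,
# leaving only the trade-off solutions no other candidate improves on in both objectives.
```

NSGA-II itself iterates this sorting over successive fronts, combined with crossover, mutation, and crowding-distance selection, to evolve a whole Pareto set rather than a single compromise solution.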