April 6, 2022, 1:12 a.m. | José Ángel Morell, Zakaria Abdelmoiz Dahi, Francisco Chicano, Gabriel Luque, Enrique Alba

cs.LG updates on arXiv.org

Federated learning is a training paradigm in which a server-side model is
cooperatively trained from local models running on edge devices, while keeping
each device's data private. These devices exchange model updates that induce a
substantial communication load, which jeopardises the efficiency of the whole
system. The difficulty of reducing this overhead lies in doing so without
degrading the model's performance, since the two objectives are in tension. To
this end, many works have investigated compressing the pre-, mid-, and
post-training models and reducing the communication rounds, but separately,
although they jointly contribute to …
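As a rough illustration of the trade-off described above (and not the method proposed in the paper), the sketch below simulates a single federated-averaging round in which each client's weight update is top-k sparsified before being sent to the server. The model, the client data, and the `fraction` parameter are all hypothetical and chosen only to make the example self-contained.

```python
# Minimal sketch of one federated-averaging round with update compression.
# Hypothetical example: model, clients, and the top-k fraction are invented
# for illustration; this is NOT the compression scheme studied in the paper.
import numpy as np

def top_k_sparsify(update, fraction=0.1):
    """Keep only the largest-magnitude `fraction` of entries (rest become 0).
    Fewer non-zero entries means fewer values to transmit per round."""
    k = max(1, int(update.size * fraction))
    threshold = np.sort(np.abs(update).ravel())[-k]
    return np.where(np.abs(update) >= threshold, update, 0.0)

def local_update(global_weights, data, labels, lr=0.1):
    """One gradient step of linear regression on a client's private data."""
    preds = data @ global_weights
    grad = data.T @ (preds - labels) / len(labels)
    return -lr * grad  # the weight delta the client would send to the server

rng = np.random.default_rng(0)
global_weights = np.zeros(20)

# Three hypothetical edge devices, each holding its own private data.
clients = [(rng.normal(size=(50, 20)), rng.normal(size=50)) for _ in range(3)]

compressed_updates = []
for data, labels in clients:
    delta = local_update(global_weights, data, labels)
    compressed_updates.append(top_k_sparsify(delta, fraction=0.1))

# Server aggregates the (sparse) client updates by simple averaging.
global_weights += np.mean(compressed_updates, axis=0)
print("non-zero entries sent per client:",
      int(np.count_nonzero(compressed_updates[0])), "of", global_weights.size)
```

With `fraction=0.1`, each client transmits only 2 of the 20 coordinates per round, which cuts communication but makes every update coarser; this is the contradictory relation the abstract refers to.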

arxiv communication federated learning nsga-ii learning
