On the Byzantine-Resilience of Distillation-Based Federated Learning
Feb. 20, 2024, 5:42 a.m. | Christophe Roux, Max Zimmer, Sebastian Pokutta
cs.LG updates on arXiv.org
Abstract: Federated Learning (FL) algorithms using Knowledge Distillation (KD) have received increasing attention due to their favorable properties with respect to privacy, non-i.i.d. data and communication cost. These methods depart from transmitting model parameters and, instead, communicate information about a learning task by sharing predictions on a public dataset. In this work, we study the performance of such approaches in the byzantine setting, where a subset of the clients act in an adversarial manner aiming to …
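To make the mechanism described in the abstract concrete, here is a minimal illustrative sketch (not taken from the paper) of distillation-based FL aggregation: clients share soft predictions on a public dataset rather than model parameters, and the server aggregates them into distillation targets. The client counts, class counts, and the simple attack below are all hypothetical; the coordinate-wise median is shown only as one generic robust alternative to naive averaging in the Byzantine setting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 10 clients, 2 of them Byzantine, predicting over a
# shared public dataset of 100 samples and 5 classes. These sizes are
# illustrative and not taken from the paper.
n_clients, n_byzantine = 10, 2
n_public, n_classes = 100, 5

# Honest clients send soft predictions (class probabilities) on the public dataset.
honest = rng.dirichlet(np.ones(n_classes), size=(n_clients - n_byzantine, n_public))

# Byzantine clients send adversarial predictions, e.g. all mass on one class.
attack = np.zeros((n_byzantine, n_public, n_classes))
attack[..., 0] = 1.0

# Stack into shape (clients, public samples, classes).
predictions = np.concatenate([honest, attack], axis=0)

# Naive aggregation: average the client predictions (not Byzantine-robust).
mean_targets = predictions.mean(axis=0)

# One generic robust alternative: coordinate-wise median over clients.
median_targets = np.median(predictions, axis=0)

# Either set of aggregated soft labels would then serve as distillation
# targets for training the global model on the public dataset.
print("mean target for sample 0:  ", np.round(mean_targets[0], 3))
print("median target for sample 0:", np.round(median_targets[0], 3))
```

In this toy example, the averaged targets are visibly skewed toward the attackers' chosen class, while the median targets track the honest clients more closely, which is the kind of behavior the Byzantine analysis in the paper is concerned with.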