Towards Model Agnostic Federated Learning Using Knowledge Distillation. (arXiv:2110.15210v2 [cs.LG] UPDATED)
Web: http://arxiv.org/abs/2110.15210
May 12, 2022, 1:11 a.m. | Andrei Afonin, Sai Praneeth Karimireddy
cs.LG updates on arXiv.org
Is it possible to design a universal API for federated learning with which
an ad-hoc group of data-holders (agents) can collaborate and perform
federated learning? Such an API would necessarily need to be
model-agnostic, i.e., make no assumptions about the model architectures used
by the agents, and it also cannot rely on having representative public data at
hand. Knowledge distillation (KD) is the obvious tool of choice for designing
such protocols. However, surprisingly, we show that most natural …
Tags: arxiv, distillation, federated learning, knowledge, learning, model
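To make the idea of a distillation-based protocol concrete, here is a minimal sketch (not the paper's algorithm) of one KD round in PyTorch: two agents with different architectures average their softened predictions on a shared unlabeled reference batch, and each agent then distills toward that consensus. The agent architectures, the distillation_round helper, and the availability of the reference batch x_ref are all illustrative assumptions; the abstract specifically notes that representative public data may not be at hand, which is part of the difficulty the paper examines.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical agents: each owns a private model of a different
# architecture. The protocol only exchanges predictions, never weights.
class AgentA(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 10))

    def forward(self, x):
        return self.net(x)

class AgentB(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(20, 128), nn.Tanh(), nn.Linear(128, 10))

    def forward(self, x):
        return self.net(x)

def distillation_round(agents, x_ref, temperature=2.0, epochs=1, lr=1e-3):
    """One communication round: average the agents' soft predictions on a
    shared unlabeled reference batch, then have each agent distill toward
    that consensus with a KL-divergence loss."""
    with torch.no_grad():
        # Teacher signal: mean of the softened class probabilities.
        consensus = torch.stack(
            [F.softmax(m(x_ref) / temperature, dim=-1) for m in agents]
        ).mean(dim=0)
    for model in agents:
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        for _ in range(epochs):
            log_p = F.log_softmax(model(x_ref) / temperature, dim=-1)
            loss = F.kl_div(log_p, consensus, reduction="batchmean")
            opt.zero_grad()
            loss.backward()
            opt.step()

agents = [AgentA(), AgentB()]
x_ref = torch.randn(256, 20)  # stand-in for the shared reference data
distillation_round(agents, x_ref)
```

Note that only predictions on x_ref cross the wire, so nothing about either architecture is revealed; that property is what makes such a protocol model-agnostic in the sense the abstract describes.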
Latest AI/ML/Big Data Jobs
Director, Applied Mathematics & Computational Research Division
@ Lawrence Berkeley National Lab | Berkeley, CA
Business Data Analyst
@ MainStreet Family Care | Birmingham, AL
Assistant/Associate Professor of the Practice in Business Analytics
@ Georgetown University McDonough School of Business | Washington, DC
Senior Data Science Writer
@ NannyML | Remote
Director of AI/ML Engineering
@ Armis Industries | Remote (US only); St. Louis; California
Digital Analytics Manager
@ Patagonia | Ventura, California