Web: http://arxiv.org/abs/2206.08516

June 20, 2022, 1:10 a.m. | Yiqiang Chen, Wang Lu, Xin Qin, Jindong Wang, Xing Xie

cs.LG updates on arXiv.org

Federated learning has attracted increasing attention for building models
without access to raw user data, especially in healthcare. In real
applications, however, different federations can seldom work together because
of data heterogeneity and distrust of, or the absence of, a central server. In
this paper, we propose a novel framework called MetaFed to facilitate
trustworthy FL between different federations. MetaFed obtains a personalized
model for each federation without a central server via the proposed Cyclic
Knowledge Distillation. Specifically, MetaFed treats each …
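
The excerpt names Cyclic Knowledge Distillation but is cut off before the details, so the snippet below is only a minimal sketch of the idea as it reads from the abstract: federations are arranged in a ring, and each one distills from the model handed over by its predecessor instead of synchronizing through a central server. The PyTorch function names (distill_step, cyclic_round), the loss weighting lam, and the temperature T are illustrative assumptions, not the paper's actual algorithm.

```python
import copy
import torch
import torch.nn.functional as F

def distill_step(student, teacher, x, y, optimizer, T=2.0, lam=0.5):
    """One local step: cross-entropy on the federation's own labels plus a
    KL term pulling the student toward the teacher's soft predictions.
    T and lam are hypothetical hyperparameters for illustration."""
    student.train()
    teacher.eval()
    optimizer.zero_grad()
    logits_s = student(x)
    with torch.no_grad():
        logits_t = teacher(x)
    ce = F.cross_entropy(logits_s, y)
    kd = F.kl_div(
        F.log_softmax(logits_s / T, dim=1),
        F.softmax(logits_t / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    loss = (1 - lam) * ce + lam * kd
    loss.backward()
    optimizer.step()
    return loss.item()

def cyclic_round(models, loaders, lr=1e-3):
    """One pass around the ring of federations: federation i uses the model
    produced by federation i-1 as its teacher, so knowledge circulates
    without any central server aggregating weights."""
    n = len(models)
    for i in range(n):
        teacher = copy.deepcopy(models[(i - 1) % n])
        student = models[i]
        opt = torch.optim.SGD(student.parameters(), lr=lr)
        for x, y in loaders[i]:
            distill_step(student, teacher, x, y, opt)
    return models
```

In this reading, repeating cyclic_round lets knowledge accumulate around the ring while each federation trains only on its own data, which is consistent with the abstract's claim of obtaining a personalized model per federation without a central server; the paper's full method may differ in how teachers are weighted and when personalization kicks in.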

arxiv distillation federated learning healthcare knowledge learning lg personalized
