Web: http://arxiv.org/abs/2206.08752

June 20, 2022, 1:10 a.m. | Fabiola Espinoza Castellon, Aurelien Mayoue, Jacques-Henri Sublemontier, Cedric Gouy-Pailler

cs.LG updates on arXiv.org

Federated learning enables different parties to collaboratively build a
global model under the orchestration of a server while keeping the training
data on the clients' devices. However, performance degrades when clients hold
heterogeneous data. To cope with this problem, we assume that despite data
heterogeneity, there are groups of clients with similar data distributions
that can be clustered. In previous approaches, in order to cluster clients,
the server requires all clients to send their parameters simultaneously.
However, this can be …
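A minimal sketch of the server-side clustering step the abstract alludes to: clients train locally, send their model parameters, and the server groups clients whose parameter vectors are close. The choice of k-means and the synthetic client parameters below are illustrative assumptions, not the paper's method.

# Illustrative sketch only: cluster clients by the model parameters they
# report to the server, as in the prior approaches the abstract mentions
# (all clients must send their parameters at the same round).
# The use of k-means and the simulated data are assumptions for illustration.
import numpy as np
from sklearn.cluster import KMeans

def cluster_clients(client_params, n_clusters=3, seed=0):
    """Group clients whose flattened parameter vectors are similar.

    client_params: list of 1-D numpy arrays, one per client, all equal length.
    Returns an array of cluster labels, one per client.
    """
    X = np.stack(client_params)          # shape: (n_clients, n_params)
    km = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10)
    return km.fit_predict(X)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulate 12 clients drawn from 3 latent data distributions: clients in
    # the same group end up with nearby parameter vectors after local training.
    centers = rng.normal(size=(3, 50))
    params = [centers[i % 3] + 0.05 * rng.normal(size=50) for i in range(12)]
    print(cluster_clients(params, n_clusters=3))   # e.g. [0 1 2 0 1 2 ...]

Note that this baseline assumes every client reports in the same round; the paper's contribution targets the case where that assumption does not hold.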

arxiv clustering data federated learning incremental learning lg
