Speeding up Heterogeneous Federated Learning with Sequentially Trained Superclients. (arXiv:2201.10899v1 [cs.LG])
Web: http://arxiv.org/abs/2201.10899
Jan. 27, 2022, 2:10 a.m. | Riccardo Zaccone, Andrea Rizzardi, Debora Caldarola, Marco Ciccone, Barbara Caputo
Source: cs.LG updates on arXiv.org
Federated Learning (FL) allows training machine learning models in privacy-constrained scenarios by enabling the cooperation of edge devices without requiring local data sharing. This approach raises several challenges due to the differing statistical distributions of the local datasets and the clients' computational heterogeneity. In particular, the presence of highly non-i.i.d. data severely impairs both the performance of the trained neural network and its convergence rate, increasing the number of communication rounds required to reach a performance comparable to that of …
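
The general pattern suggested by the title (clients grouped into "superclients" whose members train sequentially before a FedAvg-style server aggregation) can be sketched as follows. This is an illustrative sketch only, not the paper's algorithm: the helper names make_superclients, local_train, average_weights and federated_round, as well as the random grouping and uniform averaging, are assumptions introduced here.

import copy
import random

import torch


def make_superclients(clients, group_size):
    """Randomly partition clients into groups ("superclients"). Assumed grouping rule."""
    shuffled = random.sample(clients, len(clients))
    return [shuffled[i:i + group_size] for i in range(0, len(shuffled), group_size)]


def local_train(model, loader, epochs=1, lr=0.01):
    """Standard local SGD training on one client's data loader."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model


def average_weights(models):
    """Uniform parameter averaging (FedAvg-style aggregation)."""
    avg = copy.deepcopy(models[0].state_dict())
    for key in avg:
        stacked = torch.stack([m.state_dict()[key].float() for m in models])
        avg[key] = stacked.mean(dim=0).to(avg[key].dtype)
    return avg


def federated_round(global_model, client_loaders, group_size=4):
    """One communication round: inside each superclient the model is passed
    from client to client and trained sequentially, then the server averages
    the resulting superclient models."""
    superclients = make_superclients(client_loaders, group_size)
    trained = []
    for group in superclients:
        model = copy.deepcopy(global_model)
        for loader in group:  # sequential training within the superclient
            model = local_train(model, loader)
        trained.append(model)
    global_model.load_state_dict(average_weights(trained))
    return global_model

In a realistic non-i.i.d. setting the grouping would presumably account for the clients' label distributions or compute capabilities rather than being purely random; the sketch above only shows the sequential-then-average control flow.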