Bayesian Federated Learning via Predictive Distribution Distillation. (arXiv:2206.07562v1 [cs.LG])
Web: http://arxiv.org/abs/2206.07562
June 16, 2022, 1:12 a.m. | Shrey Bhatt, Aishwarya Gupta, Piyush Rai
stat.ML updates on arXiv.org
In most existing federated learning algorithms, each round consists of
minimizing a loss function at each client to learn an optimal local model,
followed by aggregating these client models at the server. Point estimates
of the model parameters at the clients do not capture the uncertainty in
each client's model. In many situations, however, especially in limited-data
settings, it is beneficial to take into account the uncertainty in the
client models for …
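The baseline round described above can be sketched as a minimal FedAvg-style loop: each client minimizes its local loss and sends back a point estimate, which the server averages. This is an illustrative sketch of the standard (non-Bayesian) setup the abstract critiques, not the paper's predictive-distribution-distillation method; the linear model, gradient-descent update, and data-size weighting are assumptions for the example.

```python
import numpy as np

def client_update(w, X, y, lr=0.1, steps=50):
    """Local training: minimize squared loss by gradient descent,
    returning a single point estimate of the parameters."""
    w = w.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def server_aggregate(client_weights, client_sizes):
    """Server step: average client point estimates, weighted by
    each client's local dataset size (as in FedAvg)."""
    return np.average(np.stack(client_weights), axis=0,
                      weights=np.asarray(client_sizes, dtype=float))

# Synthetic clients sharing one underlying linear model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (30, 50):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.01 * rng.normal(size=n)
    clients.append((X, y))

# A few communication rounds: local optimization, then aggregation.
w_global = np.zeros(2)
for _ in range(5):
    updates = [client_update(w_global, X, y) for X, y in clients]
    w_global = server_aggregate(updates, [len(y) for _, y in clients])

print(np.round(w_global, 2))
```

Note that each client returns only a single parameter vector; nothing in this loop represents how confident a client is in its estimate, which is precisely the gap a Bayesian treatment of the client models aims to fill.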