Federated Bayesian Deep Learning: The Application of Statistical Aggregation Methods to Bayesian Models
March 25, 2024, 4:41 a.m. | John Fischer, Marko Orescanin, Justin Loomis, Patrick McClure
cs.LG updates on arXiv.org
Abstract: Federated learning (FL) is an approach to training machine learning models that takes advantage of multiple distributed datasets while maintaining data privacy and reducing communication costs associated with sharing local datasets. Aggregation strategies have been developed to pool or fuse the weights and biases of distributed deterministic models; however, modern deterministic deep learning (DL) models are often poorly calibrated and lack the ability to communicate a measure of epistemic uncertainty in prediction, which is desirable …
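The aggregation step the abstract refers to — pooling the weights of distributed deterministic models — is commonly done with a size-weighted average of each client's parameters (FedAvg-style). A minimal sketch, with illustrative names and shapes that are assumptions and not drawn from the paper:

```python
# Minimal sketch of federated weight aggregation (FedAvg-style),
# illustrating the "pool or fuse the weights" step described above.
# Client data layouts and names here are illustrative assumptions.
import numpy as np

def federated_average(client_weights, client_sizes):
    """Weighted average of per-client parameter arrays.

    client_weights: list (one entry per client) of lists of np.ndarray,
                    one array per model layer
    client_sizes:   number of local training examples per client
    """
    total = sum(client_sizes)
    num_layers = len(client_weights[0])
    averaged = []
    for layer in range(num_layers):
        acc = np.zeros_like(client_weights[0][layer], dtype=float)
        for weights, n in zip(client_weights, client_sizes):
            # Each client contributes in proportion to its dataset size.
            acc += (n / total) * weights[layer]
        averaged.append(acc)
    return averaged

# Example: two clients, a single one-layer model each.
clients = [[np.array([1.0, 3.0])], [np.array([3.0, 5.0])]]
sizes = [1, 3]
print(federated_average(clients, sizes)[0])  # -> [2.5 4.5]
```

For Bayesian models, which the paper targets, each weight is a distribution rather than a point estimate, so the aggregation must fuse distribution parameters (e.g. per-weight means and variances) instead of single values — this is where the statistical aggregation methods of the title come in.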