FedMT: Federated Learning with Mixed-type Labels
Feb. 15, 2024, 5:43 a.m. | Qiong Zhang, Aline Talhouk, Gang Niu, Xiaoxiao Li
cs.LG updates on arXiv.org arxiv.org
Abstract: In federated learning (FL), classifiers (e.g., deep networks) are trained on datasets from multiple centers without exchanging data across them, thus improving sample efficiency. In the classical FL setting, the same labeling criterion is usually employed across all centers involved in training. This constraint greatly limits the applicability of FL. For example, the standards used for disease diagnosis are likely to differ across clinical centers, which mismatches the classical FL setting. …
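The abstract's core setup, training a shared classifier across centers without exchanging raw data, can be illustrated with a minimal federated-averaging (FedAvg) sketch. This is a generic illustration of the classical FL setting the paper contrasts itself with, not the FedMT method itself; the data, model, and hyperparameters below are hypothetical.

```python
import numpy as np

# Minimal FedAvg sketch (illustrative only, not the paper's FedMT method):
# each center trains a local logistic-regression classifier on its own
# private data, and the server averages the weights. No raw data leaves
# any center -- only model parameters are exchanged.

rng = np.random.default_rng(0)

def local_train(w, X, y, lr=0.1, epochs=20):
    """Run a few gradient steps of logistic regression locally."""
    w = w.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)       # gradient step
    return w

# Three simulated centers, each holding private data from the same task.
true_w = np.array([2.0, -1.0])                 # hypothetical ground truth
centers = []
for _ in range(3):
    X = rng.normal(size=(100, 2))
    y = (X @ true_w + rng.normal(scale=0.1, size=100) > 0).astype(float)
    centers.append((X, y))

# Federated rounds: broadcast global weights, train locally, average.
w_global = np.zeros(2)
for _ in range(10):
    local_ws = [local_train(w_global, X, y) for X, y in centers]
    w_global = np.mean(local_ws, axis=0)       # FedAvg aggregation

print(w_global)  # should point roughly in the direction of true_w
```

Note that this sketch assumes every center uses the same labeling criterion, which is exactly the constraint the paper relaxes with mixed-type labels.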