FedDistill: Global Model Distillation for Local Model De-Biasing in Non-IID Federated Learning
April 16, 2024, 4:41 a.m. | Changlin Song, Divya Saxena, Jiannong Cao, Yuqing Zhao
cs.LG updates on arXiv.org
Abstract: Federated Learning (FL) is a novel approach that enables collaborative machine learning while preserving data privacy by leveraging models trained on decentralized devices. However, FL faces challenges from non-uniformly distributed (non-iid) data across clients, which degrades model performance and generalization. To tackle the non-iid issue, recent efforts have used the global model as a teacher for local models. However, our pilot study shows that their effectiveness is constrained by imbalanced …
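The abstract only sketches the idea of using the global model as a teacher for local models. As a rough illustration (not FedDistill's actual formulation — the function names, temperature, and weighting below are hypothetical), a common global-to-local distillation objective combines the client's cross-entropy loss with a KL term that pulls the local (student) model's predictions toward the global (teacher) model's softened predictions:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    zs = [z / T for z in logits]
    m = max(zs)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def distill_loss(student_logits, teacher_logits, label, T=2.0, lam=0.5):
    """Hypothetical local training objective:
    cross-entropy on the client's own label, plus a KL divergence
    term distilling the global (teacher) model into the local
    (student) model. T softens both distributions; lam weights
    the distillation term. T**2 rescales gradients as in
    standard knowledge distillation."""
    p_s = softmax(student_logits)
    ce = -math.log(p_s[label] + 1e-12)
    p_t = softmax(teacher_logits, T)   # softened teacher distribution
    p_sT = softmax(student_logits, T)  # softened student distribution
    kl = sum(pt * (math.log(pt + 1e-12) - math.log(ps + 1e-12))
             for pt, ps in zip(p_t, p_sT))
    return ce + lam * (T ** 2) * kl

# A client whose local data under-represents some classes still
# receives signal about them through the teacher term.
loss = distill_loss([2.0, 0.5, -1.0], [1.0, 1.0, 0.8], label=0)
```

When the student's logits match the teacher's, the KL term vanishes and the loss reduces to plain cross-entropy, so the distillation term only acts where the local model drifts from the global one.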