April 11, 2022, 1:11 a.m. | Yiqing Shen, Yuyin Zhou, Lequan Yu

cs.LG updates on arXiv.org

Federated learning (FL) is a distributed learning paradigm that enables
multiple clients to collaboratively learn a shared global model. Despite
recent progress, handling clients with heterogeneous data remains challenging,
as their discrepant data distributions usually prevent the global model from
generalizing well on each participating client. In this paper, we propose
CD^2-pFed, a novel Cyclic Distillation-guided Channel Decoupling framework,
to personalize the global model in FL under various settings of data
heterogeneity. Different from previous …
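The decoupling idea the abstract alludes to can be illustrated with a minimal sketch: in standard FedAvg, all model weights are averaged across clients, whereas channel decoupling keeps a fraction of each layer's channels private per client and averages only the shared remainder. The split ratio, layer shapes, and `aggregate` helper below are illustrative assumptions, not the paper's actual CD^2-pFed method (which additionally uses cyclic distillation between the shared and personalized parts).

```python
import numpy as np

# Illustrative sketch (not CD^2-pFed itself): each client's weight
# matrix is split column-wise ("channel-wise") into a shared part,
# averaged across clients as in FedAvg, and a private tail kept local.
rng = np.random.default_rng(0)
n_clients, n_channels, private_frac = 3, 8, 0.25
n_private = int(n_channels * private_frac)  # channels kept local per client

# Per-client weights of shape (features, channels), randomly initialized.
weights = [rng.normal(size=(4, n_channels)) for _ in range(n_clients)]

def aggregate(weights, n_private):
    """FedAvg over the shared channels only; private channels are untouched."""
    shared_avg = np.mean([w[:, :-n_private] for w in weights], axis=0)
    updated = []
    for w in weights:
        w = w.copy()
        w[:, :-n_private] = shared_avg  # overwrite shared part with the average
        updated.append(w)               # private tail stays personalized
    return updated

updated = aggregate(weights, n_private)
# After aggregation, shared channels agree across clients while the
# private channels still differ, carrying the per-client personalization.
assert np.allclose(updated[0][:, :-n_private], updated[1][:, :-n_private])
assert not np.allclose(updated[0][:, -n_private:], updated[1][:, -n_private:])
```

In a real FL round, each client would locally train its full matrix before the server averages the shared channels; the sketch only shows the aggregation step.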

Tags: arxiv, cd, cv, distillation, federated learning, personalization
