Feb. 6, 2024, 5:48 a.m. | Dun Zeng, Zenglin Xu, Yu Pan, Xu Luo, Qifan Wang, Xiaoying Tang

cs.LG updates on arXiv.org arxiv.org

Federated Learning (FL) is a distributed learning paradigm for training a global model across multiple devices without collecting local data. In FL, the server typically selects a subset of clients for each training round to optimize resource usage. Central to this process is unbiased client sampling, which ensures a representative selection of clients. Current methods primarily use a random sampling procedure which, despite its effectiveness, achieves suboptimal efficiency owing to the loose upper bound caused by the …
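The idea of unbiased client sampling mentioned in the abstract can be sketched as follows; this is an illustrative toy (scalar updates, uniform sampling without replacement, hypothetical function names), not the paper's method. Because each client is included with probability m/n, reweighting each sampled update by n/m yields a Horvitz-Thompson-style estimate whose expectation matches full participation.

```python
from itertools import combinations
import random

def aggregate_sampled(client_updates, sampled):
    """Aggregate a subset of scalar client updates without bias.

    Each client in `sampled` was included with probability m/n, so
    weighting its update by n/m (inverse inclusion probability) makes
    the averaged aggregate an unbiased estimator of the
    full-participation average.
    """
    n, m = len(client_updates), len(sampled)
    return sum(client_updates[i] * (n / m) for i in sampled) / n

# One training round: the server samples m of n clients uniformly.
updates = [1.0, 2.0, 3.0, 4.0]          # toy per-client updates
chosen = random.sample(range(len(updates)), 2)
estimate = aggregate_sampled(updates, chosen)

# Unbiasedness check: averaging the estimate over every possible
# 2-client subset recovers the full-participation mean exactly.
vals = [aggregate_sampled(updates, s) for s in combinations(range(4), 2)]
assert abs(sum(vals) / len(vals) - sum(updates) / len(updates)) < 1e-9
```

In a real FL system the updates are model-weight vectors rather than scalars, but the same inverse-probability weighting keeps the aggregate unbiased.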

