Web: http://arxiv.org/abs/2205.01470

May 4, 2022, 1:11 a.m. | Zhigang Yan, Dong Li, Zhichao Zhang, Jiguang He

cs.LG updates on arXiv.org arxiv.org

In federated learning (FL), a number of devices train their local models and upload the corresponding parameters or gradients to the base station (BS) to update the global model, while keeping their data private. However, due to limited computation and communication resources, the number of local trainings (a.k.a. local updates) and the number of aggregations (a.k.a. global updates) need to be carefully chosen. In this paper, we investigate and analyze the optimal trade-off between the number of local trainings and …
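The local/global update loop the abstract describes can be sketched with FedAvg-style averaging on a toy least-squares task. This is a minimal illustration, not the paper's method: the device data, the learning rate, and the counts `local_steps` (local trainings per round) and `global_rounds` (aggregations) are all assumed for the example.

```python
import numpy as np

def local_update(w, X, y, local_steps, lr=0.1):
    """One device: run `local_steps` gradient steps on its own data.

    Illustrative least-squares objective; the step count `local_steps`
    is the "number of local trainings" the abstract refers to.
    """
    for _ in range(local_steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def federated_round(w_global, devices, local_steps):
    """One global update: devices train locally, the BS averages the results."""
    local_models = [local_update(w_global.copy(), X, y, local_steps)
                    for X, y in devices]
    return np.mean(local_models, axis=0)  # aggregation at the base station

def train(devices, dim, local_steps, global_rounds):
    """Alternate local training and global aggregation for `global_rounds` rounds."""
    w = np.zeros(dim)
    for _ in range(global_rounds):
        w = federated_round(w, devices, local_steps)
    return w
```

Under resource constraints, raising `local_steps` cuts communication rounds but costs more on-device computation (and can let local models drift apart on non-IID data), while raising `global_rounds` does the opposite; the trade-off between the two is what the paper optimizes.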

