On Pre-Training for Federated Learning. (arXiv:2206.11488v1 [cs.LG])
Web: http://arxiv.org/abs/2206.11488
June 24, 2022, 1:12 a.m. | Hong-You Chen, Cheng-Hao Tu, Ziwei Li, Han-Wei Shen, Wei-Lun Chao
Source: cs.CV updates on arXiv.org
In most of the literature on federated learning (FL), neural networks are
initialized with random weights. In this paper, we present an empirical study
on the effect of pre-training on FL. Specifically, we aim to investigate whether
pre-training can alleviate the drastic accuracy drop when clients'
decentralized data are non-IID. We focus on FedAvg, the fundamental and most
widely used FL algorithm. We found that pre-training does largely close the gap
between FedAvg and centralized learning under non-IID data, but …
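The abstract names FedAvg without spelling it out. Below is a minimal sketch of one FedAvg round in Python (NumPy) on a toy non-IID linear-regression problem: the server broadcasts the global weights, each client runs a few local gradient steps, and the server averages the resulting local models weighted by client data size. The client split, the local training schedule, and the "pre-trained" initialization (a point near the true weights, standing in for pre-trained features) are all illustrative assumptions, not the paper's experimental setup.

import numpy as np

def local_sgd(w, X, y, lr=0.01, steps=5):
    # Full-batch gradient steps on one client's local MSE objective.
    w = w.copy()
    for _ in range(steps):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(w_global, clients):
    # One FedAvg round: broadcast the global weights, train locally on
    # each client, then average the local models weighted by data size.
    n_total = sum(len(y) for _, y in clients)
    w_avg = np.zeros_like(w_global)
    for X, y in clients:
        w_avg += (len(y) / n_total) * local_sgd(w_global, X, y)
    return w_avg

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])

# Toy non-IID split: each client draws features from a different region.
clients = []
for shift in (-3.0, 0.0, 3.0):
    X = rng.normal(shift, 1.0, size=(50, 2))
    y = X @ w_true + rng.normal(0.0, 0.1, size=50)
    clients.append((X, y))

# Compare a random initialization with a stand-in "pre-trained" one.
for name, w0 in [("random init", rng.normal(size=2)),
                 ("pre-trained init", w_true + 0.3)]:
    w = w0
    for _ in range(20):
        w = fedavg_round(w, clients)
    print(f"{name}: w = {np.round(w, 3)}")

On this toy convex problem both initializations eventually converge; the paper's claim concerns deep networks, where the accuracy gap between FedAvg and centralized training under non-IID data is far more pronounced.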