Web: http://arxiv.org/abs/2206.10783

June 23, 2022, 1:10 a.m. | Bin Yang, Thomas Carette, Masanobu Jimbo, Shinya Maruyama

cs.LG updates on arXiv.org arxiv.org

Federated Learning (FL) allows a number of agents to participate in training
a global machine learning model without disclosing locally stored data.
Compared to traditional distributed learning, the heterogeneity (non-IID data)
across agents slows convergence in FL. Furthermore, many datasets, being
too noisy or too small, are easily overfitted by complex models such as deep
neural networks. Here, we consider the problem of using FL regression on noisy,
hierarchical, and tabular datasets in which user distributions are significantly …
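As a point of reference for the setting the abstract describes, the sketch below shows the standard FedAvg aggregation step that baseline FL builds on: each agent trains on its local data and sends back only model parameters, and the server combines them as a weighted average by local dataset size. This is a hypothetical illustration of the general FL protocol, not the method proposed in the paper; the function name and toy values are assumptions.

```python
# Hypothetical sketch of the FedAvg aggregation step (baseline FL, not
# the paper's method). Agents send parameter vectors, never raw data;
# the server averages them, weighted by each agent's local dataset size.

def fedavg(agent_params, agent_sizes):
    """Weighted average of per-agent parameter vectors.

    agent_params: list of equal-length parameter lists, one per agent.
    agent_sizes:  number of local samples each agent trained on.
    """
    total = sum(agent_sizes)
    n_params = len(agent_params[0])
    return [
        sum(params[j] * size for params, size in zip(agent_params, agent_sizes)) / total
        for j in range(n_params)
    ]

# Two agents with non-IID data: the agent holding more samples (30 vs. 10)
# pulls the global parameters toward its local solution.
global_params = fedavg([[1.0, 0.0], [0.0, 1.0]], agent_sizes=[30, 10])
print(global_params)  # [0.75, 0.25]
```

Under heterogeneity this size-weighted average can drift away from any single agent's optimum, which is one reason non-IID data slows FL convergence.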

