Web: http://arxiv.org/abs/2206.08829

June 20, 2022, 1:12 a.m. | Anis Elgabli, Chaouki Ben Issaid, Amrit S. Bedi, Ketan Rajawat, Mehdi Bennis, Vaneet Aggarwal

stat.ML updates on arXiv.org

Newton-type methods are popular in federated learning due to their fast
convergence. However, they suffer from two main issues: low communication
efficiency and weak privacy, both stemming from the need to send Hessian
information from clients to the parameter server (PS). In this work, we
introduce a novel framework called FedNew in which there is no need to
transmit Hessian information from clients to the PS, thereby resolving this
bottleneck and improving communication efficiency. In addition, FedNew hides
the gradient information and …
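
The excerpt cuts off before the abstract details the mechanism, but the two claims it does make (no Hessian transmission, hidden gradients) can be illustrated with a toy sketch. In the NumPy sketch below, the local least-squares objective, the damped local Newton solve, and the simple averaging of directions at the PS are all illustrative assumptions, not the paper's actual update rule; it only contrasts a naive federated Newton round, where each client uploads its O(d²) Hessian and raw gradient, with a FedNew-style round where only a d-dimensional direction leaves each client.

import numpy as np

def local_curvature(X, y, w):
    """Gradient and Hessian of a client's local least-squares objective."""
    r = X @ w - y
    g = X.T @ r / len(y)
    H = X.T @ X / len(y)
    return g, H

def naive_newton_round(clients, w):
    # Each client uploads (g_k, H_k): O(d^2) communication per client,
    # and the raw gradient/Hessian leak local information to the PS.
    gs, Hs = zip(*(local_curvature(X, y, w) for X, y in clients))
    g, H = np.mean(gs, axis=0), np.mean(Hs, axis=0)
    return w - np.linalg.solve(H, g)

def fednew_style_round(clients, w, damping=1e-3):
    # Illustrative alternative: each client solves its own (damped)
    # Newton system locally and uploads only the d-dimensional
    # direction, so neither the Hessian nor the raw gradient is sent.
    dirs = []
    for X, y in clients:
        g, H = local_curvature(X, y, w)
        dirs.append(np.linalg.solve(H + damping * np.eye(len(w)), g))
    return w - np.mean(dirs, axis=0)  # PS averages directions only

# Synthetic federated setup: 4 clients sharing one true model.
rng = np.random.default_rng(0)
w_true = rng.normal(size=5)
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 5))
    clients.append((X, X @ w_true + 0.01 * rng.normal(size=50)))

w = np.zeros(5)
for _ in range(10):
    w = fednew_style_round(clients, w)
print("distance to w*:", np.linalg.norm(w - w_true))

Per-round upload in the naive scheme grows quadratically in the model dimension d, while the FedNew-style round stays linear, which is the communication bottleneck the abstract refers to.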
