Web: http://arxiv.org/abs/2201.11036

Sept. 16, 2022, 1:12 a.m. | Giacomo Verardo, Daniel Barreira, Marco Chiesa, Dejan Kostic, Gerald Q. Maguire Jr

cs.LG updates on arXiv.org (arxiv.org)

In cross-device Federated Learning (FL), clients with low computational power
train a common machine learning model by exchanging parameter updates instead
of potentially private data. Federated Dropout (FD) is a technique that
improves the communication efficiency of an FL session by selecting a subset
of model parameters to be updated in each training round. However, compared to
standard FL, FD achieves considerably lower accuracy and suffers from a longer
convergence time. In this paper, we leverage coding theory to enhance FD …
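To make the mechanism concrete, here is a minimal sketch of a plain Federated Dropout round (not the paper's coded variant): the server picks a random subset of parameter indices, clients compute updates, and only the selected indices are aggregated and applied. The function names, the `keep_frac` parameter, and the toy gradient callback are illustrative assumptions, not the authors' API.

```python
import random

def federated_dropout_round(global_model, client_grads_fn, clients,
                            keep_frac=0.5, lr=0.1):
    """One illustrative FD round: only a random subset of parameter
    indices is updated, cutting the amount of data each client must
    communicate in that round."""
    n = len(global_model)
    k = max(1, int(keep_frac * n))
    subset = random.sample(range(n), k)  # parameters chosen this round
    # Average the clients' gradients over the selected subset only.
    avg_grad = {i: 0.0 for i in subset}
    for c in clients:
        grads = client_grads_fn(global_model, c)
        for i in subset:
            avg_grad[i] += grads[i] / len(clients)
    # Apply a gradient step to the selected parameters; the rest are untouched.
    new_model = list(global_model)
    for i in subset:
        new_model[i] -= lr * avg_grad[i]
    return new_model, subset
```

For example, with a toy loss whose gradient is `2 * x` per coordinate, only the sampled half of the parameters changes after a round, which is the source of both FD's bandwidth savings and the accuracy/convergence penalty the abstract describes.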

