Web: http://arxiv.org/abs/2205.04855

May 11, 2022, 1:11 a.m. | Mohamed Ridha Znaidi, Gaurav Gupta, Paul Bogdan

cs.LG updates on arXiv.org

Decentralized learning is an emerging, efficient paradigm for boosting the computing capability of multiple bounded computing agents. In the big data era, when inference is performed within distributed and federated learning (DL and FL) frameworks, the central server must process a large amount of data while relying on various agents to carry out multiple distributed training tasks. Given the decentralized computing topology, privacy has become a first-class concern. Moreover, the assumption of limited information processing capability for the agents calls for a sophisticated …
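The server-plus-agents setup described in the abstract resembles a generic federated-averaging loop: agents train locally on private data and a central server aggregates their updates. Below is a minimal Python sketch of that generic pattern, not the paper's method; the function names, the quadratic toy objective, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

def local_update(global_weights, data, lr=0.1, steps=5):
    """One agent's local training: a few gradient steps on a toy
    least-squares objective ||Xw - y||^2, standing in for whatever
    model the agent actually trains."""
    w = global_weights.copy()
    X, y = data
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_weights, agent_datasets):
    """Central server: collect each agent's locally updated weights
    and average them (FedAvg-style aggregation); raw data never
    leaves the agents."""
    local_weights = [local_update(global_weights, d) for d in agent_datasets]
    return np.mean(local_weights, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([1.0, -2.0, 0.5])
    # Three agents, each holding its own private shard of data.
    agents = []
    for _ in range(3):
        X = rng.normal(size=(50, 3))
        y = X @ true_w + 0.01 * rng.normal(size=50)
        agents.append((X, y))

    w = np.zeros(3)
    for _ in range(20):  # several communication rounds
        w = federated_round(w, agents)
    print("estimated weights:", w)
```

In this sketch only model weights travel between agents and server, which is the communication and privacy trade-off the abstract alludes to.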

