AdaGossip: Adaptive Consensus Step-size for Decentralized Deep Learning with Communication Compression
April 10, 2024, 4:41 a.m. | Sai Aparna Aketi, Abolfazl Hashemi, Kaushik Roy
cs.LG updates on arXiv.org
Abstract: Decentralized learning is crucial in supporting on-device learning over large distributed datasets, eliminating the need for a central server. However, the communication overhead remains a major bottleneck for the practical realization of such decentralized setups. To tackle this issue, several algorithms for decentralized training with compressed communication have been proposed in the literature. Most of these algorithms introduce an additional hyper-parameter referred to as consensus step-size which is tuned based on the compression ratio at …
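For readers unfamiliar with the consensus step-size mentioned in the abstract, below is a minimal sketch of one compressed-gossip round in the style of CHOCO-Gossip; it is not the paper's implementation. Agents exchange top-k-compressed parameter updates and mix neighbours' shared estimates scaled by a consensus step-size gamma. The function names, the top-k compressor, and the fixed gamma are illustrative assumptions; the adaptive rule AdaGossip applies to gamma is not given in the truncated abstract.

import numpy as np

def top_k_compress(vec, k):
    # Keep the k largest-magnitude entries of vec, zero the rest.
    out = np.zeros_like(vec)
    if k <= 0:
        return out
    idx = np.argpartition(np.abs(vec), -k)[-k:]
    out[idx] = vec[idx]
    return out

def gossip_round(params, estimates, mixing_w, gamma, k):
    # params:    (n_agents, dim) local parameters x_i
    # estimates: (n_agents, dim) publicly shared estimates x_hat_i
    # mixing_w:  (n_agents, n_agents) doubly stochastic mixing matrix
    # gamma:     consensus step-size (the hyper-parameter AdaGossip adapts)
    # k:         number of coordinates kept by the compressor
    n = params.shape[0]
    # Each agent compresses and shares only the change in its parameters,
    # which neighbours use to refresh their copy of x_hat_i.
    new_estimates = estimates + np.stack(
        [top_k_compress(params[i] - estimates[i], k) for i in range(n)]
    )
    # Consensus step: move x_i toward the neighbourhood average of the
    # shared estimates, scaled by the consensus step-size gamma.
    new_params = params + gamma * (mixing_w @ new_estimates - new_estimates)
    return new_params, new_estimates

In this style of algorithm, gamma trades off consensus speed against the error introduced by compression, which is why it is typically tuned against the compression ratio, as the abstract notes.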