BEER: Fast $O(1/T)$ Rate for Decentralized Nonconvex Optimization with Communication Compression. (arXiv:2201.13320v2 [cs.LG] UPDATED)
cs.LG updates on arXiv.org arxiv.org
Communication efficiency has been widely recognized as the bottleneck for
large-scale decentralized machine learning applications in multi-agent or
federated environments. To tackle the communication bottleneck, there have been
many efforts to design communication-compressed algorithms for decentralized
nonconvex optimization, where the clients are only allowed to communicate a
small amount of quantized information (i.e., bits) with their neighbors over a
predefined graph topology. Despite significant efforts, the state-of-the-art
algorithm in the nonconvex setting still suffers from a slower rate of
convergence …
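The compression operator the abstract refers to can be illustrated with a minimal sketch. The code below is not the BEER algorithm itself; it is a generic, unbiased stochastic quantizer (in the spirit of QSGD-style compressors) showing how a client could shrink a gradient vector to a few bits per coordinate before sending it to a neighbor. The function names and the `levels` parameter are illustrative assumptions, not from the paper.

```python
import numpy as np

def quantize(x, levels=4, rng=None):
    """Illustrative stochastic uniform quantizer (not BEER's exact operator).

    Each coordinate of x is mapped to one of (levels + 1) grid points in
    [-scale, scale], where scale = max|x_i|. Randomized rounding makes the
    operator unbiased: E[dequantize(quantize(x))] = x.
    Returns (int8 codes, float scale) -- the "small amount of quantized
    information" a client would transmit.
    """
    rng = rng or np.random.default_rng()
    scale = float(np.abs(x).max())
    if scale == 0.0:
        return np.zeros_like(x, dtype=np.int8), 0.0
    # Map coordinates to the [0, levels] grid, then round stochastically
    # so the rounding error has zero mean.
    y = (x / scale + 1.0) / 2.0 * levels
    low = np.floor(y)
    q = low + (rng.random(x.shape) < (y - low))
    return q.astype(np.int8), scale

def dequantize(q, scale, levels=4):
    """Reconstruct an estimate of x from the transmitted codes."""
    return (q.astype(np.float64) / levels * 2.0 - 1.0) * scale
```

Averaging many independent quantizations of the same vector recovers it, which is the unbiasedness property compressed decentralized methods typically rely on in their analysis.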