Compressed Gradient Tracking for Decentralized Optimization Over General Directed Networks. (arXiv:2106.07243v3 [math.OC] UPDATED)
cs.LG updates on arXiv.org
In this paper, we propose two communication-efficient decentralized
optimization algorithms over a general directed multi-agent network. The first
algorithm, termed Compressed Push-Pull (CPP), combines the gradient tracking
Push-Pull method with communication compression. We show that CPP is applicable
to a general class of unbiased compression operators and achieves a linear
convergence rate for strongly convex and smooth objective functions. The second
algorithm is a broadcast-like version of CPP (B-CPP), and it also achieves a
linear convergence rate under the same conditions …
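The abstract refers to a "general class of unbiased compression operators", i.e. randomized maps C with E[C(x)] = x that reduce the amount of data each agent transmits. As a minimal illustration (not the paper's specific operator), here is a sketch of rand-k sparsification, a standard unbiased compressor: keep k randomly chosen coordinates and rescale them by d/k so the expectation matches the input.

```python
import numpy as np

def rand_k(x, k, rng):
    """Unbiased rand-k sparsification: keep k random coordinates of x,
    scaled by d/k so that E[rand_k(x)] = x."""
    d = x.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(x)
    out[idx] = x[idx] * (d / k)  # rescale to make the estimator unbiased
    return out

# Empirical unbiasedness check: average many compressed copies of one vector.
rng = np.random.default_rng(0)
x = np.arange(1.0, 6.0)          # d = 5
avg = np.mean([rand_k(x, 2, rng) for _ in range(20000)], axis=0)
```

Each call transmits only k of the d coordinates (here 2 of 5), which is the communication saving such schemes trade against the extra variance the compressor introduces.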