Distributed Proximal Splitting Algorithms with Rates and Acceleration. (arXiv:2010.00952v3 [math.OC] UPDATED)
Web: http://arxiv.org/abs/2010.00952
Jan. 28, 2022, 2:11 a.m. | Laurent Condat, Grigory Malinovsky, Peter Richtárik
cs.LG updates on arXiv.org
We analyze several generic proximal splitting algorithms well suited for
large-scale convex nonsmooth optimization. We derive sublinear and linear
convergence results with new rates on the function value suboptimality or
distance to the solution, as well as new accelerated versions, using varying
stepsizes. In addition, we propose distributed variants of these algorithms,
which can be accelerated as well. While most existing results are ergodic, our
nonergodic results significantly broaden our understanding of primal-dual
optimization algorithms.
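To illustrate the general family of methods the abstract refers to, here is a minimal sketch of the simplest proximal splitting scheme, forward-backward splitting (proximal gradient), applied to a lasso problem. This is a generic textbook instance of proximal splitting, not the paper's specific distributed or accelerated variants; all names and parameters below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def proximal_gradient(A, b, lam, step, n_iter=500):
    # Forward-backward splitting for: min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # Forward step: gradient descent on the smooth quadratic term.
    # Backward step: prox of the nonsmooth l1 term (soft-thresholding).
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Small synthetic example with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true

# A constant stepsize 1/L, with L the Lipschitz constant of the
# smooth part's gradient (squared spectral norm of A).
step = 1.0 / np.linalg.norm(A, 2) ** 2
x_hat = proximal_gradient(A, b, lam=0.1, step=step)
```

The constant stepsize `1/L` guarantees convergence here; the varying-stepsize and accelerated schemes analyzed in the paper improve on this basic rate.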