May 12, 2022, 1:11 a.m. | Xiaoyun Li, Belhal Karimi, Ping Li

cs.LG updates on arXiv.org

We study COMP-AMS, a distributed optimization framework based on gradient averaging and the adaptive AMSGrad algorithm. Gradient compression with error feedback is applied to reduce the communication cost of gradient transmission. Our convergence analysis of COMP-AMS shows that this compressed gradient averaging strategy achieves the same convergence rate as standard AMSGrad, and also exhibits a linear speedup in the number of local workers. Compared with recently proposed protocols for distributed adaptive methods, COMP-AMS is simple and convenient. Numerical experiments …
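Below is a minimal, single-process sketch of the mechanism the abstract describes: each worker compresses its local gradient with an error-feedback buffer, and a server averages the compressed gradients and takes an AMSGrad step. The top-k compressor and the names topk_compress, Worker, and AMSGradServer are illustrative assumptions for this sketch, not the paper's actual implementation.

import numpy as np

def topk_compress(g, k):
    # Keep the k largest-magnitude entries of g; zero out the rest.
    out = np.zeros_like(g)
    idx = np.argpartition(np.abs(g), -k)[-k:]
    out[idx] = g[idx]
    return out

class Worker:
    # Local worker: compresses its gradient and keeps an error-feedback buffer.
    def __init__(self, dim, k):
        self.e = np.zeros(dim)   # accumulated compression error
        self.k = k

    def compressed_grad(self, g):
        corrected = g + self.e              # re-inject previous error (error feedback)
        c = topk_compress(corrected, self.k)
        self.e = corrected - c              # store what was dropped for the next round
        return c

class AMSGradServer:
    # Server: averages compressed gradients and applies an AMSGrad step
    # (bias correction omitted for brevity).
    def __init__(self, dim, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        self.m = np.zeros(dim)
        self.v = np.zeros(dim)
        self.v_hat = np.zeros(dim)          # running max of v: the AMSGrad fix to Adam
        self.lr, self.b1, self.b2, self.eps = lr, beta1, beta2, eps

    def step(self, theta, grads):
        g = np.mean(grads, axis=0)          # gradient averaging over workers
        self.m = self.b1 * self.m + (1 - self.b1) * g
        self.v = self.b2 * self.v + (1 - self.b2) * g**2
        self.v_hat = np.maximum(self.v_hat, self.v)
        return theta - self.lr * self.m / (np.sqrt(self.v_hat) + self.eps)

# Toy usage: n workers minimizing a shared quadratic f(x) = ||x - x*||^2 / 2,
# whose gradient at theta is (theta - x_star), plus per-worker noise.
rng = np.random.default_rng(0)
dim, n_workers = 50, 4
x_star = rng.normal(size=dim)
theta = np.zeros(dim)
workers = [Worker(dim, k=5) for _ in range(n_workers)]
server = AMSGradServer(dim, lr=0.1)
for t in range(500):
    grads = [w.compressed_grad(theta - x_star + 0.01 * rng.normal(size=dim))
             for w in workers]
    theta = server.step(theta, grads)

The error-feedback buffer is what lets the compressed scheme match the uncompressed convergence rate: whatever the compressor drops is carried over and added back before the next compression, so no gradient information is permanently lost. Averaging over the n workers is the source of the linear speedup claimed in the abstract.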

Tags: arxiv, compression, distributed, gradient, ml, optimization
