Efficient Randomized Subspace Embeddings for Distributed Optimization under a Communication Budget. (arXiv:2103.07578v4 [cs.LG] UPDATED)
Aug. 17, 2022, 1:10 a.m. | Rajarshi Saha, Mert Pilanci, Andrea J. Goldsmith
cs.LG updates on arXiv.org arxiv.org
We study first-order optimization algorithms under the constraint that the descent direction is quantized using a pre-specified budget of $R$ bits per dimension, where $R \in (0, \infty)$. We propose computationally efficient optimization algorithms with convergence rates matching the information-theoretic performance lower bounds for: (i) smooth and strongly-convex objectives with access to an exact gradient oracle, as well as (ii) general convex and non-smooth objectives with access to a noisy subgradient oracle. The crux of these algorithms is a polynomial complexity …