Feb. 26, 2024, 5:42 a.m. | Balázs Szegedy, Domonkos Czifra, Péter Kőrösi-Szabó

cs.LG updates on arXiv.org arxiv.org

arXiv:2402.15262v1 Announce Type: new
Abstract: Define an optimizer as having memory $k$ if it stores $k$ dynamically changing vectors in the parameter space. Classical SGD has memory $0$, momentum SGD optimizer has $1$ and Adam optimizer has $2$. We address the following questions: How can optimizers make use of more memory units? What information should be stored in them? How to use them for the learning steps? As an approach to the last question, we introduce a general method called …
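The memory counts in the abstract can be illustrated by writing out the per-step update of each optimizer and counting the auxiliary vectors carried between steps. The sketch below uses the standard textbook forms of SGD, momentum SGD, and Adam (it is not taken from the paper itself, and the hyperparameter values are conventional defaults), shown here in one dimension for brevity:

```python
import math

def sgd_step(theta, grad, lr=0.1):
    # memory 0: no auxiliary state persists between steps
    return theta - lr * grad

def momentum_step(theta, grad, m, lr=0.1, beta=0.9):
    # memory 1: one stored vector, the momentum buffer m
    m = beta * m + grad
    return theta - lr * m, m

def adam_step(theta, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # memory 2: two stored vectors, the first moment m and second moment v
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad * grad
    m_hat = m / (1 - b1 ** t)   # bias correction at step t
    v_hat = v / (1 - b2 ** t)
    return theta - lr * m_hat / (math.sqrt(v_hat) + eps), m, v
```

In this framing, a memory-$k$ optimizer simply threads $k$ such state vectors through every call; the paper's questions concern what a useful state beyond $k=2$ should contain and how the update rule should read it.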

