Dynamic Memory Based Adaptive Optimization
Feb. 26, 2024, 5:42 a.m. | Balázs Szegedy, Domonkos Czifra, Péter Kőrösi-Szabó
cs.LG updates on arXiv.org
Abstract: Define an optimizer as having memory $k$ if it stores $k$ dynamically changing vectors in the parameter space. Classical SGD has memory $0$, momentum SGD has memory $1$, and Adam has memory $2$. We address the following questions: How can optimizers make use of more memory units? What information should be stored in them? How should they be used in the learning steps? As an approach to the last question, we introduce a general method called …
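The memory counts in the abstract can be made concrete with a minimal sketch: each optimizer below carries exactly as many persistent parameter-space vectors as its stated memory. The function names, hyperparameters, and update details are illustrative textbook forms, not taken from the paper.

```python
import numpy as np

def sgd_step(w, grad, lr=0.1):
    # Memory 0: no state is carried between steps.
    return w - lr * grad

def momentum_step(w, grad, state, lr=0.1, beta=0.9):
    # Memory 1: one velocity vector "v" persists across steps.
    state["v"] = beta * state["v"] + grad
    return w - lr * state["v"]

def adam_step(w, grad, state, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Memory 2: first-moment "m" and second-moment "s" vectors persist.
    state["t"] += 1
    state["m"] = b1 * state["m"] + (1 - b1) * grad
    state["s"] = b2 * state["s"] + (1 - b2) * grad ** 2
    m_hat = state["m"] / (1 - b1 ** state["t"])
    s_hat = state["s"] / (1 - b2 ** state["t"])
    return w - lr * m_hat / (np.sqrt(s_hat) + eps)
```

For example, minimizing $f(w) = \|w\|^2$ (gradient $2w$) with any of the three drives $w$ toward the origin; the difference is only in how much per-parameter state each rule maintains between steps.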
Jobs in AI, ML, Big Data
Data Architect
@ University of Texas at Austin | Austin, TX
Data ETL Engineer
@ University of Texas at Austin | Austin, TX
Lead GNSS Data Scientist
@ Lurra Systems | Melbourne
Senior Machine Learning Engineer (MLOps)
@ Promaton | Remote, Europe
Tableau/PowerBI Developer (A.Con)
@ KPMG India | Bengaluru, Karnataka, India
Software Engineer, Backend - Data Platform (Big Data Infra)
@ Benchling | San Francisco, CA