Shuffling Momentum Gradient Algorithm for Convex Optimization
March 6, 2024, 5:42 a.m. | Trang H. Tran, Quoc Tran-Dinh, Lam M. Nguyen
cs.LG updates on arXiv.org
Abstract: The Stochastic Gradient Descent method (SGD) and its stochastic variants have become methods of choice for solving finite-sum optimization problems arising from machine learning and data science, thanks to their ability to handle large-scale applications and big datasets. In recent decades, researchers have made substantial efforts to study the theoretical performance of SGD and its shuffling variants. However, only limited work has investigated its shuffling momentum variants, including shuffling heavy-ball momentum schemes for non-convex …
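To ground the terminology, here is a minimal sketch of a shuffling gradient method with heavy-ball momentum for a finite-sum objective. The update rule, the step size eta, the momentum weight beta, and the helper grad_i are illustrative assumptions for this sketch, not the paper's exact Shuffling Momentum Gradient algorithm, whose per-epoch momentum aggregation may differ.

import numpy as np

def shuffling_momentum_gradient(grad_i, w0, n, epochs, eta=0.01, beta=0.5, seed=0):
    # Minimize f(w) = (1/n) * sum_i f_i(w) by random reshuffling: each epoch
    # visits every component exactly once, in a fresh random order.
    # grad_i(w, i) returns the gradient of the i-th component f_i at w.
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float).copy()
    m = np.zeros_like(w)                 # heavy-ball momentum buffer
    for _ in range(epochs):
        for i in rng.permutation(n):     # shuffle the component order
            g = grad_i(w, i)
            m = beta * m + g             # accumulate a decaying gradient sum
            w = w - eta * m              # step along the momentum direction
    return w

# Example on a convex least-squares sum, f_i(w) = 0.5 * (a_i @ w - b_i)**2:
rng = np.random.default_rng(1)
A, b = rng.normal(size=(100, 5)), rng.normal(size=100)
w_hat = shuffling_momentum_gradient(lambda w, i: (A[i] @ w - b[i]) * A[i],
                                    np.zeros(5), n=100, epochs=200)

The fresh permutation drawn each epoch is what distinguishes a shuffling scheme from plain with-replacement SGD, and it is the sampling regime the abstract's convergence analysis concerns.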