Convergence Analysis of Stochastic Gradient Descent with MCMC Estimators
March 26, 2024, 4:49 a.m. | Tianyou Li, Fan Chen, Huajie Chen, Zaiwen Wen
stat.ML updates on arXiv.org arxiv.org
Abstract: Understanding stochastic gradient descent (SGD) and its variants is essential for machine learning. However, most prior analyses are conducted under amenable conditions, such as unbiased gradient estimators and bounded objective functions, which do not cover many sophisticated applications such as variational Monte Carlo, entropy-regularized reinforcement learning, and variational inference. In this paper, we consider the SGD algorithm that employs a Markov chain Monte Carlo (MCMC) estimator to compute the gradient, called MCMC-SGD. Since …
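The setting the abstract describes, SGD driven by gradient estimates built from MCMC samples rather than i.i.d. draws, can be sketched in a small toy example. The choices below (a Gaussian target N(θ, 1), a score-function gradient of E[x²], and random-walk Metropolis) are illustrative assumptions, not the paper's actual setup; the point is only that the chain's samples are correlated and the resulting gradient estimator is biased at finite chain length, which is exactly the regime the paper analyzes.

```python
import numpy as np

def log_p(x, theta):
    # Unnormalized log-density of N(theta, 1).
    return -0.5 * (x - theta) ** 2

def mcmc_samples(theta, n, x0, rng, step=1.0):
    # Random-walk Metropolis chain targeting p_theta.
    # Samples are correlated, so the gradient estimate below
    # is biased for any finite chain length.
    x = x0
    out = []
    for _ in range(n):
        prop = x + step * rng.standard_normal()
        if np.log(rng.uniform()) < log_p(prop, theta) - log_p(x, theta):
            x = prop
        out.append(x)
    return np.array(out), x

rng = np.random.default_rng(0)
theta, x = 3.0, 0.0   # initial parameter and chain state
lr = 0.05
for _ in range(500):
    # Warm-start the chain from its last state (common in practice).
    xs, x = mcmc_samples(theta, 20, x, rng)
    # Score-function estimator of d/dtheta E_{p_theta}[x^2],
    # i.e. E[x^2 * (x - theta)]; for N(theta, 1) the true gradient is 2*theta,
    # so SGD should drive theta toward 0.
    grad = np.mean(xs ** 2 * (xs - theta))
    theta -= lr * grad
```

After the loop, `theta` should be close to the minimizer 0 of E_{p_theta}[x²], up to noise from the short, correlated chains; the analysis in the paper is about quantifying exactly this kind of bias and its effect on convergence.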