Convergence Analysis of Stochastic Gradient Descent with MCMC Estimators
March 26, 2024, 4:49 a.m. | Tianyou Li, Fan Chen, Huajie Chen, Zaiwen Wen
stat.ML updates on arXiv.org arxiv.org
Abstract: Understanding stochastic gradient descent (SGD) and its variants is essential for machine learning. However, most preceding analyses are conducted under amenable conditions such as an unbiased gradient estimator and bounded objective functions, which do not encompass many sophisticated applications, such as variational Monte Carlo, entropy-regularized reinforcement learning, and variational inference. In this paper, we consider the SGD algorithm that employs a Markov chain Monte Carlo (MCMC) estimator to compute the gradient, called MCMC-SGD. Since …
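The setting the abstract describes can be illustrated with a toy sketch (not the paper's algorithm or analysis): SGD on a parameterized expectation where each gradient is a score-function estimate computed from Metropolis–Hastings samples, so the estimator is biased and its samples are correlated. All names and the specific objective below are hypothetical choices for illustration.

```python
import math
import random

def metropolis_samples(log_p, x0, n, step=0.5):
    """Random-walk Metropolis chain targeting exp(log_p).
    Samples are correlated and biased by initialization --
    the kind of gradient-estimator noise MCMC-SGD must handle."""
    x, out = x0, []
    for _ in range(n):
        prop = x + random.gauss(0.0, step)
        delta = log_p(prop) - log_p(x)
        if random.random() < math.exp(min(0.0, delta)):
            x = prop
        out.append(x)
    return out

def mcmc_sgd(f, theta=3.0, lr=0.05, iters=200, chain_len=50):
    """Toy MCMC-SGD: minimize E_{x ~ N(theta, 1)}[f(x)] using a
    score-function gradient estimated from one MCMC chain per step."""
    x = theta  # chain state, warm-started across iterations
    for _ in range(iters):
        # unnormalized log-density of N(theta, 1)
        log_p = lambda z: -0.5 * (z - theta) ** 2
        xs = metropolis_samples(log_p, x, chain_len)
        x = xs[-1]  # persistent chain: reuse final state next iteration
        # score function of N(theta, 1) is (x - theta), so
        # grad L(theta) ~ mean of f(x) * (x - theta) over the chain
        grad = sum(f(z) * (z - theta) for z in xs) / len(xs)
        theta -= lr * grad
    return theta

random.seed(0)
# For f(x) = x^2, E_{x~N(theta,1)}[x^2] = theta^2 + 1 is minimized at theta = 0.
theta_hat = mcmc_sgd(lambda z: z * z)
```

Because each gradient comes from a finite, correlated chain rather than i.i.d. samples, the estimator is biased; this is exactly the departure from the standard unbiased-gradient assumption that motivates the paper's convergence analysis.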