Central Limit Theorem for Two-Timescale Stochastic Approximation with Markovian Noise: Theory and Applications
Feb. 15, 2024, 5:44 a.m. | Jie Hu, Vishwaraj Doshi, Do Young Eun
cs.LG updates on arXiv.org
Abstract: Two-timescale stochastic approximation (TTSA) is among the most general frameworks for iterative stochastic algorithms. It includes well-known stochastic optimization methods such as SGD variants and those designed for bilevel or minimax problems, as well as reinforcement learning methods such as the family of gradient-based temporal difference (GTD) algorithms. In this paper, we conduct an in-depth asymptotic analysis of TTSA under controlled Markovian noise via the central limit theorem (CLT), uncovering the coupled dynamics of TTSA influenced by the …
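To fix ideas, the generic TTSA iteration couples a fast iterate and a slow iterate, each updated with its own step-size sequence whose ratio vanishes. Below is a minimal, hypothetical sketch of that structure: the drift functions `f` and `g` and the step-size exponents are illustrative placeholders, not taken from the paper, and i.i.d. Gaussian noise stands in for the controlled Markovian noise the paper analyzes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generic two-timescale stochastic approximation (TTSA):
#   x_{n+1} = x_n + alpha_n * (f(x_n, y_n) + noise)   (fast iterate)
#   y_{n+1} = y_n + beta_n  * (g(x_n, y_n) + noise)   (slow iterate)
# with beta_n / alpha_n -> 0, so y is quasi-static from x's viewpoint.
# The drifts below are toy examples with unique root (x*, y*) = (0, 0).

def f(x, y):
    return -x + 0.5 * y   # hypothetical fast-timescale drift

def g(x, y):
    return -y + 0.5 * x   # hypothetical slow-timescale drift

x, y = 1.0, -1.0
for n in range(1, 200_001):
    alpha = n ** -0.6            # fast step size
    beta = n ** -0.9             # slow step size; beta/alpha -> 0
    x += alpha * (f(x, y) + rng.normal(scale=0.1))
    y += beta * (g(x, y) + rng.normal(scale=0.1))

print(f"x={x:.3f}, y={y:.3f}")   # both iterates approach the root (0, 0)
```

A CLT of the kind studied in the paper then characterizes the fluctuations of the suitably rescaled iterates (e.g. `x / sqrt(alpha_n)`) around this root; under Markovian noise the asymptotic covariance additionally involves the noise's correlation structure, not just its marginal variance.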