Feb. 15, 2024, 5:44 a.m. | Jie Hu, Vishwaraj Doshi, Do Young Eun

cs.LG updates on arXiv.org

arXiv:2401.09339v2 Announce Type: replace-cross
Abstract: Two-timescale stochastic approximation (TTSA) is among the most general frameworks for iterative stochastic algorithms. It includes well-known stochastic optimization methods such as SGD variants and those designed for bilevel or minimax problems, as well as reinforcement learning algorithms such as the family of gradient-based temporal difference (GTD) algorithms. In this paper, we conduct an in-depth asymptotic analysis of TTSA under controlled Markovian noise via a central limit theorem (CLT), uncovering the coupled dynamics of TTSA influenced by the …
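To make the setting concrete, here is a minimal sketch of a generic TTSA recursion with a fast and a slow iterate driven by a Markov chain. This is not the paper's construction: the drift fields f and g, the toy chain, and the step-size exponents are illustrative assumptions chosen only so that the slow step size decays faster than the fast one.

```python
import numpy as np

# Minimal TTSA sketch (hypothetical example, not the authors' setting).
rng = np.random.default_rng(0)

def markov_step(s, n_states=5):
    """Toy Markov chain supplying the (state-dependent) noise."""
    return (s + rng.integers(-1, 2)) % n_states

def f(x, y, s):
    """Fast-timescale drift (hypothetical), perturbed by chain state s."""
    return -(x - y) + 0.1 * (s - 2)

def g(x, y, s):
    """Slow-timescale drift (hypothetical)."""
    return -y + 0.5 * x + 0.1 * (s - 2)

x, y, s = 1.0, -1.0, 0
for n in range(1, 10_001):
    beta_n = 1.0 / n**0.6    # fast step size
    gamma_n = 1.0 / n**0.9   # slow step size; gamma_n / beta_n -> 0
    x += beta_n * f(x, y, s)   # fast iterate tracks its equilibrium for the current y
    y += gamma_n * g(x, y, s)  # slow iterate evolves under the tracked fast variable
    s = markov_step(s)         # Markovian noise, as in the controlled-noise setting

print(f"x ~ {x:.3f}, y ~ {y:.3f}")
```

The separation of step sizes is what makes the two iterates evolve on different timescales; the CLT analysis described in the abstract characterizes the joint fluctuations of such coupled iterates around their limits.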
