March 14, 2024, 4:43 a.m. | Puya Latafat, Andreas Themelis, Lorenzo Stella, Panagiotis Patrinos

cs.LG updates on arXiv.org

arXiv:2301.04431v4 Announce Type: replace-cross
Abstract: Backtracking linesearch is the de facto approach for minimizing continuously differentiable functions with locally Lipschitz gradient. In recent years, it has been shown that in the convex setting it is possible to avoid linesearch altogether, and to allow the stepsize to adapt based on a local smoothness estimate without any backtracks or evaluations of the function value. In this work we propose an adaptive proximal gradient method, adaPG, that uses novel estimates of the local …
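
The abstract describes the mechanism only at a high level, so here is a minimal sketch, in Python, of a proximal gradient (forward-backward) iteration whose stepsize adapts to a local smoothness estimate computed from successive gradients, with no backtracking and no function-value evaluations. The routine name `adaptive_prox_grad`, the growth factor `sqrt(2)`, and the safeguard `gamma <= 1/(2L)` are illustrative assumptions for this sketch, not the adaPG update rule from the paper.

```python
import numpy as np

def adaptive_prox_grad(grad_f, prox_g, x0, gamma0=1.0, max_iter=500, tol=1e-9):
    """Forward-backward iteration with a linesearch-free adaptive stepsize.

    The stepsize gamma is updated from a local Lipschitz estimate of grad f
    along the last step; no backtracking and no function values are used.
    The specific update below (growth by sqrt(2), cap at 1/(2L)) is an
    illustrative assumption, not the paper's rule.
    """
    x = np.asarray(x0, dtype=float)
    g = grad_f(x)
    gamma = gamma0
    for _ in range(max_iter):
        x_new = prox_g(x - gamma * g, gamma)   # forward-backward step
        dx = x_new - x
        if np.linalg.norm(dx) <= tol:
            return x_new
        g_new = grad_f(x_new)
        # local smoothness estimate: ||change in gradient|| / ||step||
        L = np.linalg.norm(g_new - g) / np.linalg.norm(dx)
        # grow the stepsize when the local estimate allows it, cap otherwise
        gamma = min(gamma * np.sqrt(2.0), 0.5 / L) if L > 0 else gamma * np.sqrt(2.0)
        x, g = x_new, g_new
    return x

# Usage: lasso, minimize 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)  # gradient of the smooth part
prox_g = lambda v, gamma: np.sign(v) * np.maximum(np.abs(v) - gamma * lam, 0.0)  # soft-thresholding
x_hat = adaptive_prox_grad(grad_f, prox_g, np.zeros(100))
```

Soft-thresholding is the exact prox of the l1 norm, so each iteration stays cheap: one gradient evaluation per step and no function values, which is the practical appeal of dropping the linesearch.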
