Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization
May 15, 2024, 4:43 a.m. | Lesi Chen, Jing Xu, Luo Luo
cs.LG updates on arXiv.org
Abstract: We consider the optimization problem of the form $\min_{x \in \mathbb{R}^d} f(x) \triangleq \mathbb{E}_{\xi} [F(x; \xi)]$, where the component $F(x;\xi)$ is $L$-mean-squared Lipschitz but possibly nonconvex and nonsmooth. The recently proposed gradient-free method requires at most $\mathcal{O}( L^4 d^{3/2} \epsilon^{-4} + \Delta L^3 d^{3/2} \delta^{-1} \epsilon^{-4})$ stochastic zeroth-order oracle complexity to find a $(\delta,\epsilon)$-Goldstein stationary point of the objective function, where $\Delta = f(x_0) - \inf_{x \in \mathbb{R}^d} f(x)$ and $x_0$ is the initial point of …
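For context, gradient-free methods of this kind typically build on a randomized two-point finite-difference estimator of the gradient: the objective is probed at $x \pm \delta v$ along a random unit direction $v$, yielding an unbiased estimate of the gradient of a smoothed surrogate even when $f$ itself is nonsmooth. The sketch below shows this standard building block inside a plain zeroth-order SGD loop; it is illustrative only and does not reproduce the paper's specific algorithm or match its complexity bounds. The names `F`, `sample_xi`, the step size, and the smoothing radius `delta` are assumptions for the example.

```python
import numpy as np

def zo_gradient_estimate(F, x, xi, delta, rng):
    """Two-point zeroth-order gradient estimator via randomized smoothing.

    Standard building block of gradient-free methods (illustrative;
    not the specific algorithm analyzed in the paper).
    """
    d = x.shape[0]
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)  # uniform direction on the unit sphere
    # Finite difference along v at smoothing radius delta; the d/(2*delta)
    # scaling makes this an unbiased estimate of the smoothed gradient.
    return (d / (2.0 * delta)) * (F(x + delta * v, xi) - F(x - delta * v, xi)) * v

def zo_sgd(F, x0, sample_xi, delta=1e-3, lr=1e-2, steps=1000, seed=0):
    """Plain zeroth-order SGD driven by the estimator above (illustrative)."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(steps):
        xi = sample_xi(rng)  # draw one stochastic sample per step
        x -= lr * zo_gradient_estimate(F, x, xi, delta, rng)
    return x
```

For instance, running `zo_sgd` on the nonsmooth stochastic objective $F(x;\xi) = \|x - \xi\|_1$ with small Gaussian noise $\xi$ drives the iterates from a distant starting point toward the minimizer near the origin, using only function evaluations.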