Lazy Queries Can Reduce Variance in Zeroth-order Optimization. (arXiv:2206.07126v1 [cs.LG])
Web: http://arxiv.org/abs/2206.07126
June 16, 2022, 1:10 a.m. | Quan Xiao, Qing Ling, Tianyi Chen
cs.LG updates on arXiv.org
A major challenge of applying zeroth-order (ZO) methods is their high query
complexity, especially when queries are costly. We propose a novel gradient
estimation technique for ZO methods based on adaptive lazy queries, which we
term LAZO. Unlike classic one-point or two-point gradient estimation methods,
LAZO develops two alternative ways to check the usefulness of old queries from
previous iterations, and then adaptively reuses them to construct low-variance
gradient estimates. We rigorously establish that through judiciously …
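
To make the reuse idea concrete, below is a minimal Python sketch (assuming NumPy) of a classic two-point ZO estimator and a lazy-query variant that reuses the previous iteration's perturbed query when a simple distance-based check passes. The check, the threshold tol, and the helper names two_point_grad and lazy_grad are hypothetical illustrations of the general idea only; the paper's actual LAZO criteria are not given in this excerpt.

import numpy as np

def two_point_grad(f, x, mu=1e-3, rng=None):
    """Classic two-point zeroth-order gradient estimate (two fresh queries)."""
    rng = rng or np.random.default_rng()
    d = x.size
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    return d * (f(x + mu * u) - f(x)) / mu * u

def lazy_grad(f, x, cache, mu=1e-3, tol=0.5, rng=None):
    """Lazy-query variant: reuse the perturbed query from the previous
    iteration when a (hypothetical) usefulness check passes; otherwise fall
    back to a fresh two-point estimate.
    cache holds (x_old, u_old, f_pert_old) from the last iteration."""
    rng = rng or np.random.default_rng()
    d = x.size
    fx = f(x)  # one fresh query at the current iterate
    if cache is not None:
        x_old, u_old, f_pert_old = cache
        # Hypothetical usefulness check: reuse the old query only if the
        # iterate has moved little relative to the smoothing radius.
        # This is NOT the paper's criterion, which the excerpt omits.
        if np.linalg.norm(x - x_old) <= tol * mu:
            g = d * (f_pert_old - fx) / mu * u_old  # only one new query
            return g, (x, u_old, f_pert_old)
    # Otherwise pay for a fresh perturbed query (two new queries this step).
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    f_pert = f(x + mu * u)
    g = d * (f_pert - fx) / mu * u
    return g, (x, u, f_pert)

# Toy usage on a quadratic; the step size and iteration count are illustrative.
f = lambda z: float(np.sum(z ** 2))
x, cache = np.ones(10), None
for _ in range(200):
    g, cache = lazy_grad(f, x, cache)
    x = x - 0.05 * g

The point of the sketch is the query accounting: when the reuse check passes, the iteration spends one function query instead of two, which is where the variance/complexity trade-off described in the abstract arises.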