Triangulation candidates for Bayesian optimization. (arXiv:2112.07457v2 [stat.CO] UPDATED)
May 23, 2022, 1:11 a.m. | Robert B. Gramacy, Annie Sauer, Nathan Wycoff
stat.ML updates on arXiv.org arxiv.org
Bayesian optimization involves "inner optimization" over a new-data
acquisition criterion which is non-convex/highly multi-modal, may be
non-differentiable, or may otherwise thwart local numerical optimizers. In such
cases it is common to replace continuous search with a discrete one over random
candidates. Here we propose using candidates based on a Delaunay triangulation
of the existing input design. We detail the construction of these "tricands"
and demonstrate empirically how they outperform both numerically optimized
acquisitions and random candidate-based alternatives, and are well-suited …
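The idea can be sketched in a few lines: triangulate the existing design points and use simplex centroids as a discrete candidate set for the acquisition search. This is a simplified illustration only, not the paper's full "tricands" construction (which also augments the set beyond simplex interiors), and the distance-based acquisition here is a hypothetical stand-in for a real criterion such as expected improvement.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)

# Existing 2-D input design (points already evaluated by the surrogate).
X = rng.uniform(size=(12, 2))

# Delaunay-triangulate the design; take the centroid of each simplex
# as a discrete candidate for the inner acquisition optimization.
tri = Delaunay(X)
candidates = X[tri.simplices].mean(axis=1)  # shape (n_simplices, 2)

# Stand-in acquisition: prefer candidates near a hypothetical target (0.5, 0.5).
def acquisition(pts):
    return -np.linalg.norm(pts - 0.5, axis=1)

# Discrete search over candidates replaces a continuous local optimizer.
best = candidates[np.argmax(acquisition(candidates))]
```

Because the candidates adapt to the geometry of the existing design, they concentrate where gaps between evaluated points are largest, which is one intuition for why such sets can outperform uniformly random candidates.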