CAGES: Cost-Aware Gradient Entropy Search for Efficient Local Multi-Fidelity Bayesian Optimization
May 14, 2024, 4:42 a.m. | Wei-Ting Tang, Joel A. Paulson
cs.LG updates on arXiv.org (arxiv.org)
Abstract: Bayesian optimization (BO) is a popular approach for optimizing expensive-to-evaluate black-box objective functions. An important challenge in BO is its application to high-dimensional search spaces due in large part to the curse of dimensionality. One way to overcome this challenge is to focus on local BO methods that aim to efficiently learn gradients, which have shown strong empirical performance on a variety of high-dimensional problems including policy search in reinforcement learning (RL). However, current local …
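The abstract's central idea, local BO that learns the objective's gradient from a surrogate model and follows it, can be sketched as below. This is an illustrative sketch only, not the CAGES algorithm from the paper: it fits a plain GP to nearby evaluations, differentiates the posterior mean analytically, and takes one ascent step. All function names and parameters here are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, ls=0.5):
    # Squared-exponential kernel matrix between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def gp_posterior_mean_grad(X, y, x, ls=0.5, noise=1e-6):
    """Analytic gradient of the GP posterior mean at query point x.

    Local BO methods use such surrogate-gradient estimates in place of
    (unavailable) true gradients of the black-box objective.
    """
    K = rbf_kernel(X, X, ls) + noise * np.eye(len(X))
    alpha = np.linalg.solve(K, y)                  # alpha = K^{-1} y
    k = rbf_kernel(x[None, :], X, ls).ravel()      # k(x, X_i) for each datum
    # d/dx k(x, x_i) = -(x - x_i)/ls^2 * k(x, x_i) for the RBF kernel
    dk = -(x[None, :] - X) / ls**2 * k[:, None]
    return dk.T @ alpha                            # grad of posterior mean

# Toy black-box objective f(x) = -||x||^2, observed at random points.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 2))
y = -(X**2).sum(axis=1)

x = np.array([0.5, -0.3])                 # current incumbent
g = gp_posterior_mean_grad(X, y, x)       # surrogate gradient estimate
x_next = x + 0.1 * g                      # one local gradient-ascent step
```

Because the true gradient at `x` is `-2x`, the surrogate estimate should point roughly toward the origin, so the step moves the incumbent uphill on the objective. The cost-aware, multi-fidelity aspects of CAGES (choosing which fidelity to query to sharpen this gradient estimate per unit cost) are not modeled here.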