A Quadrature Approach for General-Purpose Batch Bayesian Optimization via Probabilistic Lifting
April 19, 2024, 4:41 a.m. | Masaki Adachi, Satoshi Hayakawa, Martin Jørgensen, Saad Hamid, Harald Oberhauser, Michael A. Osborne
Source: cs.LG updates on arXiv.org
Abstract: Parallelisation is a common strategy in Bayesian optimisation, but it faces several challenges: the need for flexibility in acquisition functions and kernel choices, flexibility in handling discrete and continuous variables simultaneously, robustness to model misspecification, and, lastly, fast, massive parallelisation. To address these challenges, we introduce a versatile and modular framework for batch Bayesian optimisation via probabilistic lifting with kernel quadrature, called SOBER, which we present as a Python library based on GPyTorch/BoTorch. Our framework offers the following …
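To make the core idea concrete, here is a minimal NumPy sketch of batch selection via probabilistic lifting and kernel quadrature: acquisition values over a candidate set are lifted into a probability measure, and a diverse batch is then chosen by greedy kernel herding so that its empirical kernel mean tracks the measure's mean embedding. All names, the RBF kernel choice, and the toy acquisition function are illustrative assumptions, not SOBER's actual API.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=0.3):
    """Squared-exponential kernel between two sets of row-vector points."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def herding_batch(candidates, weights, batch_size, lengthscale=0.3):
    """Greedy kernel herding: pick a batch whose empirical kernel mean
    approximates the weighted mean embedding of the lifted measure."""
    K = rbf_kernel(candidates, candidates, lengthscale)
    mu = K @ weights                       # mean embedding at each candidate
    chosen = []
    running = np.zeros(len(candidates))    # sum of k(., x_i) over chosen x_i
    for t in range(batch_size):
        scores = mu - running / (t + 1)    # herding objective
        scores[chosen] = -np.inf           # never re-select a point
        idx = int(np.argmax(scores))
        chosen.append(idx)
        running += K[:, idx]
    return chosen

# Toy run: lift a 1-D stand-in acquisition function into a probability
# measure, then select a diverse batch of 5 points from 200 candidates.
rng = np.random.default_rng(0)
cands = rng.uniform(0.0, 1.0, size=(200, 1))
acq = np.sin(6 * cands[:, 0])              # hypothetical acquisition values
w = np.exp(acq)
w /= w.sum()                               # probabilistic lifting (softmax)
batch = herding_batch(cands, w, batch_size=5)
```

Because herding penalises points close to those already chosen, the batch spreads out over high-acquisition regions rather than clustering at a single maximiser, which is what makes the selected points useful for parallel evaluation.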