Nov. 14, 2022, 2:12 a.m. | Kilean Hwang, Tomofumi Maruta, Alexander Plastun, Kei Fukushima, Tong Zhang, Qiang Zhao, Peter Ostroumov, Yue Hao

cs.LG updates on arXiv.org

Bayesian optimization (BO) is often used for accelerator tuning due to its
high sample efficiency. However, the computational cost of training over
large data sets can be problematic, and incorporating historical data in a
computationally efficient way is not trivial. Here, we exploit a neural network
model trained on historical data as the prior mean of BO for FRIB Front-End
tuning.
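The core idea can be sketched as follows: a model m(x) fitted to historical data replaces the usual constant GP mean, so the GP only has to capture the residual y - m(x), and the surrogate falls back to the learned prior where data are scarce. The snippet below is a minimal NumPy illustration of that construction, not the paper's implementation; the `prior_mean` callable stands in for the trained neural network, and the kernel hyperparameters, toy objective, and expected-improvement acquisition are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm
from scipy.spatial.distance import cdist

def rbf_kernel(A, B, length_scale=1.0, variance=1.0):
    """Squared-exponential kernel k(a, b) = sigma^2 exp(-|a-b|^2 / (2 l^2))."""
    d2 = cdist(A, B, "sqeuclidean")
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(X_train, y_train, X_test, prior_mean, noise=1e-4):
    """GP posterior with a non-constant prior mean m(x): the GP models y - m(x)."""
    r = y_train - prior_mean(X_train)            # residuals w.r.t. the NN prior
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_train, X_test)
    Kss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, r))
    mu = prior_mean(X_test) + Ks.T @ alpha       # add the prior mean back in
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss - v.T @ v)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best_y):
    """Expected improvement for maximization."""
    z = (mu - best_y) / sigma
    return (mu - best_y) * norm.cdf(z) + sigma * norm.pdf(z)

# --- toy usage -------------------------------------------------------------
# Hypothetical stand-in for the network trained on historical tuning data.
prior_mean = lambda X: -np.sum((X - 0.4) ** 2, axis=1)
# Unknown "machine response" the optimizer is tuning (toy example).
objective = lambda X: -np.sum((X - 0.5) ** 2, axis=1)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(3, 2))               # initial observations
y = objective(X)
for _ in range(10):
    cand = rng.uniform(0, 1, size=(256, 2))      # random candidate pool
    mu, sigma = gp_posterior(X, y, cand, prior_mean)
    x_next = cand[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[None, :]))
print("best objective found:", y.max())
```

Fitting the GP to residuals is one common way to inject a learned prior mean; BO toolkits built on GPyTorch/BoTorch also allow supplying a custom mean module directly, which may be closer to how this would be done in practice.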

