Multi-objective hyperparameter optimization with performance uncertainty. (arXiv:2209.04340v1 [cs.LG])
Sept. 12, 2022, 1:11 a.m. | Alejandro Morales-Hernández, Inneke Van Nieuwenhuyse, Gonzalo Nápoles
cs.LG updates on arXiv.org
The performance of any Machine Learning (ML) algorithm is impacted by the
choice of its hyperparameters. Because training and evaluating an ML algorithm
are usually expensive, a hyperparameter optimization (HPO) method needs to be
computationally efficient to be useful in practice. Most existing approaches
to multi-objective HPO use evolutionary strategies or metamodel-based
optimization. However, few methods have been developed to account for
uncertainty in the performance measurements. This paper presents results on
multi-objective hyperparameter optimization with uncertainty on …
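To make the setting concrete, here is a minimal sketch (not the paper's method) of multi-objective HPO with noisy measurements: each hyperparameter configuration is evaluated several times on a hypothetical noisy objective function, the objectives are averaged to reduce measurement uncertainty, and the non-dominated (Pareto-optimal) configurations are extracted. The objective function, hyperparameter names, and noise model below are all illustrative assumptions.

```python
import random

def noisy_objectives(lr, batch_size, rng):
    # Hypothetical stand-in for an expensive train/evaluate run.
    # Returns (validation error, training cost), each perturbed by noise,
    # modeling the performance uncertainty discussed in the abstract.
    error = (lr - 0.1) ** 2 + 1.0 / batch_size + rng.gauss(0, 0.01)
    cost = batch_size / 32.0 + rng.gauss(0, 0.05)
    return error, cost

def mean_objectives(lr, batch_size, n_rep=5, seed=0):
    # Replicate each evaluation and average, so the noisy measurements
    # yield a more reliable estimate of the true objective values.
    rng = random.Random(seed)
    samples = [noisy_objectives(lr, batch_size, rng) for _ in range(n_rep)]
    return tuple(sum(s[i] for s in samples) / n_rep for i in range(2))

def pareto_front(points):
    # A point p is dominated if some q is <= p in every objective
    # and strictly < in at least one (minimization in all objectives).
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p)))
            and any(q[k] < p[k] for k in range(len(p)))
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front

if __name__ == "__main__":
    configs = [(lr, bs) for lr in (0.01, 0.1, 0.3) for bs in (16, 64, 256)]
    means = [mean_objectives(lr, bs) for lr, bs in configs]
    for cfg, obj in zip(configs, means):
        tag = "PARETO" if obj in pareto_front(means) else ""
        print(cfg, [round(v, 3) for v in obj], tag)
```

A metamodel-based method would replace the brute-force replication above with a surrogate (e.g., a Gaussian process) fitted to the noisy observations, trading evaluation cost for model-based prediction of both objectives and their uncertainty.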