May 2, 2022, 1:12 a.m. | Pedro Mendes, Maria Casimiro, Paolo Romano, David Garlan

cs.LG updates on arXiv.org

In the literature on hyper-parameter tuning, a number of recent solutions
rely on low-fidelity observations (e.g., training with sub-sampled datasets or
for short periods of time) to extrapolate good configurations to use when
performing full training. Among these, HyperBand is arguably one of the most
popular solutions, due to its efficiency and theoretically provable robustness.
In this work, we introduce HyperJump, a new approach that builds on HyperBand's
robust search strategy and complements it with novel model-based risk analysis
techniques …
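HyperBand's robustness comes from successive halving: candidate configurations are first scored with cheap, low-fidelity observations (small budgets), and only the most promising survivors are re-evaluated at larger budgets. The sketch below is a minimal, self-contained illustration of that idea only; it is not HyperJump's risk-analysis method, and the function names, parameters, and toy objective are all invented for the example.

```python
import random

def successive_halving(configs, evaluate, min_budget=1, eta=3, max_budget=27):
    """Keep the top 1/eta of configs at each budget level, then grow the budget.

    `evaluate(config, budget)` returns a loss estimate; small budgets give the
    noisy low-fidelity observations the abstract refers to.
    """
    budget = min_budget
    while len(configs) > 1 and budget <= max_budget:
        # Low-fidelity observation: score every surviving config at this budget.
        scored = [(evaluate(c, budget), c) for c in configs]
        scored.sort(key=lambda pair: pair[0])  # lower loss is better
        keep = max(1, len(configs) // eta)
        configs = [c for _, c in scored[:keep]]
        budget *= eta  # survivors earn a larger (higher-fidelity) budget
    return configs[0]

# Hypothetical toy objective: loss shrinks toward the best lr, with noise
# that decreases as the training budget grows.
def toy_loss(config, budget):
    return abs(config["lr"] - 0.1) + random.random() / budget

random.seed(0)
candidates = [{"lr": random.uniform(1e-3, 1.0)} for _ in range(27)]
best = successive_halving(candidates, toy_loss)
print(best)
```

Starting with 27 candidates and eta = 3, each round discards roughly two thirds of the configurations, so most of the evaluation budget is spent on a shrinking set of promising configurations rather than on full training runs for every candidate.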
