April 24, 2023, 12:46 a.m. | Shuhei Watanabe, Frank Hutter

cs.LG updates on arXiv.org

Hyperparameter optimization (HPO) is crucial for the strong performance of deep
learning algorithms, and real-world applications often impose constraints,
such as memory usage or latency, on top of the performance requirement. In this
work, we propose constrained TPE (c-TPE), an extension of the widely used,
versatile Bayesian optimization method tree-structured Parzen estimator (TPE)
that handles these constraints. Our proposed extension goes beyond a simple
combination of an existing acquisition function and the original TPE; instead,
it includes modifications that address issues …
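To make the setting concrete, below is a minimal, illustrative sketch of the TPE-style density-ratio acquisition with a naive constraint extension. It is not the authors' c-TPE implementation (the abstract states their method goes beyond this simple combination); the toy objective, constraint, threshold, and quantile are all assumptions for illustration. The idea shown: split observations into "good" and "bad" groups by an objective quantile, fit a kernel density estimator to each group, rank candidates by the ratio l(x)/g(x), and apply the same split-and-ratio trick to feasible vs. infeasible observations.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def objective(x):
    return (x - 0.3) ** 2            # toy objective to minimize (assumed)

def constraint(x):
    return x                          # toy constraint value (assumed)

LIMIT = 0.7                           # feasible if constraint(x) <= LIMIT (assumed)
GAMMA = 0.25                          # quantile defining the "good" group (assumed)

# Observed configurations (1-D for simplicity) and their evaluations.
X = rng.uniform(0.0, 1.0, size=50)
y = objective(X)
c = constraint(X)

# Objective split: "good" = best GAMMA quantile, "bad" = the rest.
tau = np.quantile(y, GAMMA)
l_obj = gaussian_kde(X[y <= tau])
g_obj = gaussian_kde(X[y > tau])

# Constraint split: feasible vs. infeasible observations.
feasible = c <= LIMIT
l_con = gaussian_kde(X[feasible])
g_con = gaussian_kde(X[~feasible])

def acquisition(x):
    """Density-ratio score; higher is better. Combines the objective ratio
    l(x)/g(x) with a feasibility ratio in a simple multiplicative way."""
    x = np.atleast_1d(x)
    eps = 1e-12
    obj_ratio = l_obj(x) / (g_obj(x) + eps)
    con_ratio = l_con(x) / (g_con(x) + eps)
    return obj_ratio * con_ratio

# Propose the next configuration: sample candidates from the "good" KDE and
# pick the one with the highest acquisition value.
candidates = l_obj.resample(100, seed=1).ravel()
next_x = candidates[np.argmax(acquisition(candidates))]
print(f"suggested next configuration: {next_x:.3f}")
```

As a usage note, this naive multiplicative combination treats the constraint split independently of the objective split; the paper's contribution lies precisely in modifying this kind of construction to avoid its failure modes.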

