Web: http://arxiv.org/abs/2201.11815

Jan. 31, 2022, 2:11 a.m. | Katarzyna Woźnica, Mateusz Grzyb, Zuzanna Trafas, Przemysław Biecek

cs.LG updates on arXiv.org

For many machine learning models, the choice of hyperparameters is a crucial
step towards achieving high performance. Prevalent meta-learning approaches
focus on obtaining good hyperparameter configurations for a completely new
task, within a limited computational budget, based on results obtained from
prior tasks. This paper proposes a new formulation of the tuning problem,
called consolidated learning, better suited to the practical challenges faced
by model developers, in which a large number of predictive models are created
on similar data …
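The truncated abstract outlines the core shift: instead of tuning each task from scratch, results from prior, similar tasks are consolidated into a transferable tuning strategy. Below is a minimal Python sketch of that general idea, assuming a simple static-portfolio strategy: rank configurations by their mean score on prior tasks, then spend the new task's budget evaluating the top of the ranking. All names and data structures here are illustrative assumptions, not the paper's actual method or API.

    # Sketch of budget-limited transfer tuning in the spirit of consolidated
    # learning. Hypothetical data layout:
    #   history: {task_id: {config_id: score}}  -- results from prior tasks
    #   configs: {config_id: hyperparameter dict}
    from statistics import mean
    from typing import Callable

    def build_portfolio(history: dict, configs: dict) -> list:
        """Order configurations by average score across the prior tasks."""
        avg = {
            cid: mean(task[cid] for task in history.values() if cid in task)
            for cid in configs
        }
        return [configs[cid] for cid in sorted(avg, key=avg.get, reverse=True)]

    def tune_new_task(portfolio: list, evaluate: Callable[[dict], float],
                      budget: int):
        """Evaluate the top `budget` configurations on the new task."""
        best_cfg, best_score = None, float("-inf")
        for cfg in portfolio[:budget]:
            score = evaluate(cfg)  # e.g. cross-validated score of an XGBoost fit
            if score > best_score:
                best_cfg, best_score = cfg, score
        return best_cfg, best_score

The design choice this sketch illustrates is that the expensive part (searching the configuration space) is amortized across the portfolio-building step, so each new similar task only pays for `budget` model evaluations.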

arxiv, learning, model, optimization, strategy, xgboost
