Oct. 21, 2022, 4:40 a.m. | Michio Suginoo

Towards Data Science - Medium (towardsdatascience.com)

Search Global Minimum while addressing Bias-Variance Trade-off

Photo by Markus Spiske on Unsplash

Since boosting machines have a tendency to overfit, XGBoost places an intense focus on addressing the bias-variance trade-off and lets users apply a variety of regularization techniques through hyperparameter tuning.

This post will walk you through a code implementation of hyperparameter tuning with the native XGBoost API to address the bias-variance trade-off. The entire code for this project is posted in my GitHub repository.

After reading …

