May 24, 2022, 6:34 p.m. | Daniel Reedstone

Towards Data Science - Medium (towardsdatascience.com)

Prototype loss functions rapidly to take full advantage of XGBoost's flexibility

Photo by Matt Artz on Unsplash

Motivation

Running XGBoost with custom loss functions can greatly improve classification/regression performance in certain applications. Being able to test many different loss functions quickly is key in time-critical research environments, and deriving gradients by hand for every candidate loss is slow, prone to human error, and sometimes numerically unstable. Manual differentiation is therefore not always feasible.

Automatic differentiation lets us obtain the derivatives of a function directly from the code that computes it. It …
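As a minimal sketch of the idea (assuming JAX and XGBoost's native training API; the pseudo-Huber loss and the random data below are illustrative choices, not from the article), jax.grad can generate the gradient and hessian that XGBoost's custom-objective hook expects:

```python
# Sketch: JAX autodiff supplying (grad, hess) for an XGBoost custom objective.
# The loss and dataset are illustrative assumptions, not the article's code.
import jax
import jax.numpy as jnp
import numpy as np
import xgboost as xgb

def pseudo_huber(pred, label, delta=1.0):
    # Per-example loss; swap in any differentiable candidate loss here.
    r = pred - label
    return delta**2 * (jnp.sqrt(1.0 + (r / delta) ** 2) - 1.0)

# First and second derivatives w.r.t. the prediction, vectorized over examples.
d1 = jax.vmap(jax.grad(pseudo_huber), in_axes=(0, 0))
d2 = jax.vmap(jax.grad(jax.grad(pseudo_huber)), in_axes=(0, 0))

def custom_objective(preds, dtrain):
    # XGBoost's native API passes raw predictions and the DMatrix,
    # and expects (grad, hess) arrays of the same shape in return.
    labels = jnp.asarray(dtrain.get_label())
    preds = jnp.asarray(preds)
    return np.asarray(d1(preds, labels)), np.asarray(d2(preds, labels))

# Illustrative usage on random regression data.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 10))
y = X[:, 0] + 0.1 * rng.normal(size=256)
dtrain = xgb.DMatrix(X, label=y)
booster = xgb.train({"tree_method": "hist"}, dtrain,
                    num_boost_round=20, obj=custom_objective)
```

With this setup, trying a new candidate loss only means editing the per-example loss function; the gradient and hessian are regenerated automatically.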

differentiation jax loss-function machine learning pytorch xgboost
