JAX vs PyTorch: Automatic Differentiation for XGBoost
May 24, 2022, 6:34 p.m. | Daniel Reedstone
Towards Data Science - Medium towardsdatascience.com
Perform rapid loss-function prototypes to take full advantage of XGBoost’s flexibility
[Photo by Matt Artz on Unsplash]
Motivation
Running XGBoost with custom loss functions can greatly improve classification/regression performance in certain applications. Being able to quickly test many different loss functions is key in time-critical research environments, so manual differentiation is not always feasible (and it is sometimes prone to human error or numerical instability).
Automatic differentiation lets us obtain the derivatives of a function directly from its computation. It …
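The excerpt breaks off here, but the core idea lends itself to a short illustration. Below is a minimal sketch, not taken from the article: the squared-log-error loss, the clamping constant, and all function names are illustrative assumptions. A single loss written as a plain scalar function is differentiated twice by JAX, and the resulting per-row gradient and Hessian are handed to XGBoost's `obj` callback.

```python
import jax
import jax.numpy as jnp
import numpy as np
import xgboost as xgb

# Illustrative loss (an assumption, not from the article): squared log error,
# written once as a scalar function of a single prediction and label.
def squared_log_error(pred, label):
    return 0.5 * (jnp.log1p(pred) - jnp.log1p(label)) ** 2

# First and second derivatives w.r.t. the prediction, vectorized over rows.
grad_fn = jax.vmap(jax.grad(squared_log_error), in_axes=(0, 0))
hess_fn = jax.vmap(jax.grad(jax.grad(squared_log_error)), in_axes=(0, 0))

def custom_objective(preds, dtrain):
    """XGBoost custom objective: returns per-row gradient and (diagonal) Hessian."""
    labels = jnp.asarray(dtrain.get_label())
    # Clamp so log1p stays defined for negative raw scores (illustrative choice).
    preds = jnp.maximum(jnp.asarray(preds), -1.0 + 1e-6)
    return np.asarray(grad_fn(preds, labels)), np.asarray(hess_fn(preds, labels))

# Usage on toy data: swapping in a new loss only requires editing
# squared_log_error; both derivatives are regenerated automatically.
dtrain = xgb.DMatrix(np.abs(np.random.randn(256, 8)), label=np.random.rand(256))
booster = xgb.train({"tree_method": "hist"}, dtrain, num_boost_round=10,
                    obj=custom_objective)
```

The payoff this sketch aims at is the prototyping loop the article describes: trying a different loss means rewriting one small function, with no hand-derived gradients or Hessians to keep in sync.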
Tags: differentiation, jax, loss-function, machine learning, pytorch, xgboost