Risk Implications of Excessive Multiple Local Minima during Hyperparameter Tuning
Oct. 12, 2022, 7:31 p.m. | Michio Suginoo
Towards Data Science - Medium towardsdatascience.com
Our Epistemological Limitation and Illusion of Knowledge
[Figure: 3D visualization produced with Matplotlib’s plot_trisurf, by Michio Suginoo]
Excessive multiple local minima during hyperparameter tuning are a symptom of model performance that is highly sensitive to small changes in hyperparameter values, as displayed in the chart above.
I encountered this very rugged performance landscape, full of dips and bumps, while performing a grid search over the hyperparameter pair reg_alpha and reg_lambda of the native XGBoost API. It was just …
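The rugged landscape the author describes can be illustrated with a minimal grid-search sketch. The loss surface below is a synthetic stand-in (a smooth bowl plus a sinusoidal ripple), not the author's actual XGBoost cross-validation results; the function `rugged_loss` and the grid ranges are assumptions for illustration only. The helper counts grid points that sit strictly below all four neighbors, which is how a "multiple local minima" symptom would show up in tuning results.

```python
import numpy as np

def rugged_loss(a, l):
    # Hypothetical stand-in for a CV loss surface over (reg_alpha, reg_lambda).
    # The sinusoidal ripple creates many local minima, mimicking the rugged
    # landscape described in the article.
    return (a - 0.5) ** 2 + (l - 0.5) ** 2 + 0.05 * np.sin(20 * a) * np.sin(20 * l)

# Grid-search the two hyperparameters over an assumed [0, 1] range.
alphas = np.linspace(0.0, 1.0, 50)
lambdas = np.linspace(0.0, 1.0, 50)
A, L = np.meshgrid(alphas, lambdas)
Z = rugged_loss(A, L)

def count_local_minima(Z):
    # A grid point is a local minimum if it is strictly below its 4 neighbors.
    n = 0
    for i in range(1, Z.shape[0] - 1):
        for j in range(1, Z.shape[1] - 1):
            v = Z[i, j]
            if (v < Z[i - 1, j] and v < Z[i + 1, j]
                    and v < Z[i, j - 1] and v < Z[i, j + 1]):
                n += 1
    return n

n_minima = count_local_minima(Z)
print(f"local minima found on the grid: {n_minima}")
```

A surface like `Z` can be rendered as in the article's figure with `ax.plot_trisurf(A.ravel(), L.ravel(), Z.ravel())` on a Matplotlib 3D axis; a count well above one signals the sensitivity risk the article discusses.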