April 11, 2024, midnight | /u/WhiteRaven_M

Data Science | www.reddit.com

I get it; XGBoost is really powerful and easy to use, while with DL there's a lot more that can go wrong hyperparameter-tuning-wise.

But I always assumed that whatever an ML model can do, a DL model with proper settings and sufficient regularization can also do, even on low-to-medium size datasets (in the hundreds to thousands of examples range).

I understand that DL models are more likely to overfit because they're very, very flexible, especially as …
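To make the kind of comparison I mean concrete, here's a minimal sketch: a boosted-tree baseline next to a small, regularized MLP on a ~1,000-row synthetic tabular dataset. Everything here is illustrative, not from the post: sklearn's HistGradientBoostingClassifier stands in for XGBoost, and the dataset and hyperparameters (hidden sizes, L2 alpha, early stopping) are just placeholder choices.

```python
# Sketch: boosted trees vs. a regularized MLP on a small tabular dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a low-to-medium size tabular dataset (~1,000 rows).
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=10, random_state=0)

# Gradient-boosted trees (histogram-based variant as an XGBoost stand-in).
gbt = HistGradientBoostingClassifier(random_state=0)

# Small MLP with L2 regularization and early stopping; scaling matters for NNs.
mlp = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64, 64), alpha=1e-2,
                  early_stopping=True, max_iter=1000, random_state=0),
)

for name, model in [("boosted trees", gbt), ("regularized MLP", mlp)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```

On data this size the two often land in the same ballpark, which is exactly the question: whether the gap people report is inherent to DL on small tabular data or mostly a tuning/regularization issue.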
