Understanding Benign Overfitting in Nested Meta Learning. (arXiv:2206.13482v1 [cs.LG] CROSS LISTED)
June 29, 2022, 1:11 a.m. | Lisha Chen, Songtao Lu, Tianyi Chen
stat.ML updates on arXiv.org arxiv.org
Meta learning has demonstrated tremendous success in few-shot learning with
limited supervised data. In those settings, the meta model is usually
overparameterized. While conventional statistical learning theory suggests
that overparameterized models tend to overfit, empirical evidence reveals that
overparameterized meta learning methods still work well -- a phenomenon often
called "benign overfitting." To understand this phenomenon, we focus on
meta learning settings with a challenging nested structure, which we term
nested meta learning, and analyze its generalization …
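The benign overfitting the abstract describes can be illustrated in its simplest form outside of meta learning. The following is a minimal sketch (not from the paper, and not its analysis) of overparameterized linear regression with a spiked feature covariance: the minimum-norm interpolator fits the noisy training labels exactly, yet its test error stays far below that of the trivial zero predictor, because the signal lives in a few high-variance directions while the many low-variance directions absorb the label noise. All names and constants here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 50, 2000, 5                  # n samples, d >> n features, k signal dims

scales = np.full(d, 0.1)
scales[:k] = 10.0                      # a few high-variance feature directions
w_star = np.zeros(d)
w_star[:k] = 1.0                       # the true signal lives in those directions

# Training data with label noise.
X = rng.normal(size=(n, d)) * scales
y = X @ w_star + rng.normal(size=n)

# Minimum-norm interpolator: with d > n it fits the noisy labels exactly.
w_hat = np.linalg.pinv(X) @ y
train_mse = np.mean((X @ w_hat - y) ** 2)

# Fresh test data from the same distribution.
X_test = rng.normal(size=(2000, d)) * scales
y_test = X_test @ w_star + rng.normal(size=2000)
test_mse = np.mean((X_test @ w_hat - y_test) ** 2)
null_mse = np.mean(y_test ** 2)        # error of predicting zero everywhere

# Training error is (numerically) zero -- the noise is fit perfectly --
# yet the test error remains far below the null predictor's.
print(f"train MSE {train_mse:.2e}, test MSE {test_mse:.2f}, null MSE {null_mse:.2f}")
```

The paper's setting is harder because the meta model sits in a nested (bilevel) optimization, but the core tension is the same: the interpolating, overparameterized solution need not be harmed by fitting noise.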