VC Theoretical Explanation of Double Descent. (arXiv:2205.15549v3 [stat.ML] UPDATED)
Sept. 30, 2022, 1:14 a.m. | Eng Hock Lee, Vladimir Cherkassky
stat.ML updates on arXiv.org arxiv.org
There has been growing interest in the generalization performance of large
multilayer neural networks that can be trained to achieve zero training error
while still generalizing well on test data. This regime is known as the
'second descent', and it appears to contradict the conventional view that
optimal model complexity should reflect an optimal balance between
underfitting and overfitting, i.e., the bias-variance trade-off. This paper presents a
VC-theoretical analysis of double descent and shows that it can be fully
explained by classical VC-generalization …