April 2, 2024, 7:44 p.m. | Elizaveta Demyanenko, Christoph Feinauer, Enrico M. Malatesta, Luca Saglietti

cs.LG updates on arXiv.org

arXiv:2401.12610v2 Announce Type: replace
Abstract: Recent works have demonstrated the existence of a double-descent phenomenon for the generalization error of neural networks, where highly overparameterized models escape overfitting and achieve good test performance, at odds with the standard bias-variance trade-off described by statistical learning theory. In the present work, we explore a link between this phenomenon and the increase in the complexity and sensitivity of the function represented by the network. In particular, we study the Boolean mean dimension (BMD), a metric …
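The abstract cuts off at the definition of the BMD, so as background only: for a function on the Boolean cube, the mean dimension is commonly written, via the ANOVA/Sobol' decomposition, as the sum of per-coordinate variance-based influences divided by the total variance, with each influence estimable from single-bit flips. Below is a minimal Monte Carlo sketch under that standard formulation; the paper's exact definition and estimator may differ, and the names here (boolean_mean_dimension, the parity example) are illustrative, not taken from the paper.

import numpy as np

def boolean_mean_dimension(f, n_inputs, n_samples=100_000, seed=0):
    # Monte Carlo estimate of the mean dimension of f on {-1,+1}^n.
    # Assumes the standard ANOVA identity: mean dimension equals the
    # sum of per-coordinate (superset) influences over the variance,
    # with Inf_i = (1/4) * E[(f(x) - f(x with bit i flipped))^2].
    rng = np.random.default_rng(seed)
    x = rng.choice([-1.0, 1.0], size=(n_samples, n_inputs))
    fx = f(x)
    total_influence = 0.0
    for i in range(n_inputs):
        x_flip = x.copy()
        x_flip[:, i] *= -1.0  # flip the i-th input bit
        total_influence += 0.25 * np.mean((fx - f(x_flip)) ** 2)
    return total_influence / fx.var()

# Sanity check: 3-bit parity embedded in 10 inputs has mean dimension 3,
# since each of the 3 relevant bits flips the sign of the output.
parity3 = lambda x: np.prod(x[:, :3], axis=1)
print(boolean_mean_dimension(parity3, n_inputs=10))  # ~3.0

For a trained network, f would be its scalar output evaluated on Boolean inputs, so the quantity tracks how sensitive the learned function is to individual input bits.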
