Bias-Variance Decompositions for Margin Losses. (arXiv:2204.12155v1 [stat.ML])
April 27, 2022, 1:11 a.m. | Danny Wood, Tingting Mu, Gavin Brown
cs.LG updates on arXiv.org arxiv.org
We introduce a novel bias-variance decomposition for a range of strictly
convex margin losses, including the logistic loss (minimized by the classic
LogitBoost algorithm), as well as the squared margin loss and canonical
boosting loss. Furthermore, we show that, for all strictly convex margin
losses, the expected risk decomposes into the risk of a "central" model and a
term quantifying variation in the functional margin with respect to variations
in the training data. These decompositions provide a diagnostic tool for …