Stochastic linear optimization never overfits with quadratically-bounded losses on general data. (arXiv:2202.06915v2 [cs.LG] UPDATED)
June 29, 2022, 1:11 a.m. | Matus Telgarsky
stat.ML updates on arXiv.org arxiv.org
This work provides test error bounds for iterative fixed point methods on
linear predictors -- specifically, stochastic and batch mirror descent (MD),
and stochastic temporal difference learning (TD) -- with two core
contributions: (a) a single proof technique which gives high probability
guarantees despite the absence of projections, regularization, or any
equivalents, even when optima have large or infinite norm, for
quadratically-bounded losses (e.g., providing unified treatment of squared and
logistic losses); (b) locally-adapted rates which depend not on global …
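The abstract's setting — stochastic mirror descent on a linear predictor with no projection or regularization, run on a quadratically-bounded loss such as the logistic loss — can be illustrated with a minimal sketch. This is not the paper's construction or proof; the Euclidean mirror map (under which mirror descent reduces to plain SGD), the synthetic data, and the step size are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = np.sign(X @ w_true)  # binary labels in {-1, +1}

def logistic_loss_grad(w, x, yi):
    # Gradient of log(1 + exp(-y * <w, x>)): the logistic loss,
    # one of the quadratically-bounded losses covered by the abstract.
    margin = yi * (x @ w)
    return -yi * x / (1.0 + np.exp(margin))

# Stochastic mirror descent with the Euclidean mirror map = plain SGD.
# Note the absence of any projection or regularization step: on separable
# data the iterate norm can grow without bound, which is exactly the
# regime the quoted bounds are stated to handle.
w = np.zeros(d)
eta = 0.1
for _ in range(2000):
    i = rng.integers(n)
    w -= eta * logistic_loss_grad(w, X[i], y[i])

train_err = np.mean(np.sign(X @ w) != y)
```

On this separable synthetic data the unprojected iterates still drive the training error down, even though no optimum of finite norm exists for the logistic loss.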