April 27, 2022, 1:12 a.m. | Konstantinos E. Nikolakakis, Farzin Haddadpour, Amin Karbasi, Dionysios S. Kalogerias

cs.LG updates on arXiv.org arxiv.org

We provide sharp path-dependent generalization and excess-error guarantees
for the full-batch Gradient Descent (GD) algorithm on smooth losses (possibly
non-Lipschitz, possibly nonconvex). At the heart of our analysis is a novel
generalization-error technique for deterministic symmetric algorithms, which
shows that average output stability together with a bounded expected gradient
of the loss at termination implies generalization. This key result shows that
small generalization error occurs at stationary points, and it allows us to
bypass Lipschitz assumptions on the loss prevalent …
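To make the setting concrete, here is a minimal sketch of full-batch GD on a smooth but non-Lipschitz toy loss, tracking the gradient norm at termination — the quantity the result above ties to generalization. The loss f(w) = ¼‖w‖⁴ and all parameter choices are illustrative assumptions, not from the paper.

```python
import numpy as np

def full_batch_gd(grad, w0, lr=0.05, steps=500):
    """Full-batch gradient descent: w_{t+1} = w_t - lr * grad(w_t).

    Deterministic given (grad, w0, lr, steps), matching the paper's
    setting of a deterministic symmetric algorithm.
    """
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

# Toy smooth loss f(w) = 0.25 * ||w||^4 (illustrative): its gradient
# ||w||^2 * w is not globally Lipschitz, yet GD still drives the
# gradient norm toward zero, i.e. toward a stationary point.
grad = lambda w: (w @ w) * w

w_final = full_batch_gd(grad, w0=np.array([1.0, -1.0]))
grad_norm_at_termination = np.linalg.norm(grad(w_final))
```

Running this, the gradient norm at the final iterate is small, illustrating the kind of "bounded expected gradient at termination" condition under which the bound applies.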

