Oct. 13, 2022, 1:15 a.m. | Fusheng Liu, Haizhao Yang, Soufiane Hayou, Qianxiao Li

stat.ML updates on arXiv.org

Optimization and generalization are two essential aspects of statistical
machine learning. In this paper, we propose a framework to connect optimization
with generalization by analyzing the generalization error based on the
optimization trajectory under the gradient flow algorithm. The key ingredient
of this framework is the Uniform-LGI, a property that is generally satisfied
when training machine learning models. Leveraging the Uniform-LGI, we first
derive convergence rates for the gradient flow algorithm, and then we give
generalization bounds for a large class of …
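
As background (the truncated abstract does not spell this out, so the exact definition used in the paper may differ): "LGI" is most naturally read as a Łojasiewicz-type gradient inequality, with "uniform" presumably indicating that the inequality holds uniformly over a region or class of problems, and gradient flow is the continuous-time limit of gradient descent. A minimal sketch, assuming a differentiable loss L with infimum L^*, a constant c > 0, and an exponent \mu following the standard Łojasiewicz convention (all of these conventions are assumptions here, not taken from the paper):

    Gradient flow:       \dot{\theta}(t) = -\nabla L(\theta(t)), \quad t \ge 0
    LGI-type condition:  \|\nabla L(\theta)\| \ge c \,\bigl(L(\theta) - L^*\bigr)^{\mu} on a region containing the trajectory

Combining the two along the flow gives

    \frac{d}{dt}\bigl(L(\theta(t)) - L^*\bigr) = -\|\nabla L(\theta(t))\|^2 \le -c^2 \,\bigl(L(\theta(t)) - L^*\bigr)^{2\mu},

and integrating this differential inequality yields an explicit convergence rate (exponential decay when \mu = 1/2, polynomial decay for larger \mu), which is the kind of convergence result the abstract refers to.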

arxiv dynamics gradient inequality optimization
