Jan. 4, 2022, 2:10 a.m. | Hao Wang, Rui Gao, Flavio P. Calmon

cs.LG updates on arXiv.org

Machine learning models trained by different optimization algorithms under
different data distributions can exhibit distinct generalization behaviors. In
this paper, we analyze the generalization of models trained by noisy iterative
algorithms. We derive distribution-dependent generalization bounds by
connecting noisy iterative algorithms to additive noise channels found in
communication and information theory. Our generalization bounds shed light on
several applications, including differentially private stochastic gradient
descent (DP-SGD), federated learning, and stochastic gradient Langevin dynamics
(SGLD). We demonstrate our bounds through numerical …
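To make the setting concrete, the sketch below shows one family of "noisy iterative algorithms" the abstract refers to: a gradient step perturbed by additive Gaussian noise, in the spirit of SGLD and the Gaussian mechanism underlying DP-SGD. This is an illustrative assumption, not the paper's analysis; the names `noisy_gradient_step`, `step_size`, `noise_scale`, and `grad_fn` are hypothetical.

```python
# Minimal sketch (assumption, not the paper's method): a noisy iterative
# update w_{t+1} = w_t - eta * grad(w_t) + N(0, sigma^2 I), as in SGLD-style
# or DP-SGD-style training where Gaussian noise is added to each step.
import numpy as np

def noisy_gradient_step(w, grad_fn, step_size=0.01, noise_scale=0.1, rng=None):
    """One noisy iterate: gradient descent step plus additive Gaussian noise."""
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(0.0, noise_scale, size=w.shape)
    return w - step_size * grad_fn(w) + noise

# Example: quadratic loss 0.5 * ||w||^2, whose gradient is w itself.
w = np.ones(3)
for _ in range(100):
    w = noisy_gradient_step(w, grad_fn=lambda v: v)
print(w)
```

The additive Gaussian perturbation is what lets such iterates be viewed as passing the gradient update through an additive noise channel, which is the connection to information theory the abstract describes.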

