Oct. 19, 2022, 1:13 a.m. | Andrew Lowy, Meisam Razaviyayn

cs.LG updates on arXiv.org

We study differentially private (DP) stochastic optimization (SO) with data
containing outliers and loss functions that are (possibly) not Lipschitz
continuous. To date, the vast majority of work on DP SO assumes that the loss
is uniformly Lipschitz over data (i.e. stochastic gradients are uniformly
bounded over all data points). While this assumption is convenient, it is often
unrealistic: in many practical problems, the loss function may not be uniformly
Lipschitz. Even when the loss function is Lipschitz continuous, the …

Tags: arxiv, extension, losses, optimization, outliers, stochastic
