Sept. 16, 2022, 1:12 a.m. | Andrew Lowy, Meisam Razaviyayn

cs.LG updates on arXiv.org

We study differentially private (DP) stochastic optimization (SO) with data containing outliers and loss functions that are not Lipschitz continuous. To date, the vast majority of work on DP SO assumes that the loss is Lipschitz (i.e., stochastic gradients are uniformly bounded), and the resulting error bounds scale with the Lipschitz parameter of the loss. While this assumption is convenient, it is often unrealistic: in many practical problems where privacy is required, data may contain outliers or be unbounded, causing some …
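For context on why the Lipschitz parameter shows up in DP error bounds: the standard mechanism, DP-SGD (Abadi et al., 2016), clips each per-example gradient to a norm bound and adds Gaussian noise whose scale is proportional to that bound, so the clip threshold effectively stands in for the Lipschitz constant. The NumPy sketch below illustrates this scaling only; it is not the method proposed in this paper, and the function name `dp_sgd_step` and all parameter values are illustrative.

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm, noise_multiplier, rng):
    """One DP-SGD-style update: clip each per-example gradient to
    norm <= clip_norm, average, then add Gaussian noise whose standard
    deviation is proportional to clip_norm (the surrogate Lipschitz bound)."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose norm exceeds the clip threshold.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    # Gaussian-mechanism noise: std scales linearly with clip_norm,
    # which is why DP error bounds inherit the Lipschitz parameter.
    noise = rng.normal(
        0.0,
        noise_multiplier * clip_norm / len(per_example_grads),
        size=mean_grad.shape,
    )
    return mean_grad + noise

# Toy usage: one gradient is an "outlier" with very large norm, as in
# the unbounded-data setting the abstract describes; clipping tames it.
rng = np.random.default_rng(0)
grads = [rng.normal(size=5) for _ in range(8)]
grads.append(100.0 * rng.normal(size=5))
print(dp_sgd_step(grads, clip_norm=1.0, noise_multiplier=1.1, rng=rng))
```

Note the trade-off the sketch makes visible: a small `clip_norm` keeps the noise small but biases the update by shrinking large (possibly informative) gradients, which is exactly the tension this line of work on non-Lipschitz losses addresses.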

arxiv extension losses optimization outliers stochastic
