Oct. 19, 2022, 1:14 a.m. | Andrew Lowy, Meisam Razaviyayn

stat.ML updates on arXiv.org

We study differentially private (DP) stochastic optimization (SO) with data
containing outliers and loss functions that are (possibly) not Lipschitz
continuous. To date, the vast majority of work on DP SO assumes that the loss
is uniformly Lipschitz over data (i.e., that stochastic gradients are uniformly
bounded over all data points). While this assumption is convenient, it is often
unrealistic: in many practical problems the loss function is not uniformly
Lipschitz. Even when the loss function is Lipschitz continuous, the …
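
The abstract is truncated above, but for context: the standard way DP optimization copes with unbounded (non-Lipschitz) stochastic gradients is per-sample gradient clipping, as in DP-SGD. The sketch below is a generic, minimal illustration of that idea in NumPy, not the algorithm proposed in this paper; the function name dp_sgd and the parameters clip_norm and noise_mult are illustrative, and the noise level is not calibrated to any specific (epsilon, delta) privacy budget.

```python
import numpy as np

def dp_sgd(grad_fn, data, w0, clip_norm=1.0, noise_mult=1.0,
           lr=0.1, batch_size=32, steps=200, seed=0):
    """Run a basic DP-SGD loop; grad_fn(w, x) returns the loss gradient at sample x."""
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float).copy()
    n = len(data)
    for _ in range(steps):
        idx = rng.choice(n, size=min(batch_size, n), replace=False)
        g_sum = np.zeros_like(w)
        for i in idx:
            g = np.asarray(grad_fn(w, data[i]), dtype=float)
            # Per-sample clipping caps each gradient's norm at clip_norm, so the
            # update has bounded sensitivity even when the loss is not uniformly
            # Lipschitz and individual outliers have very large gradients.
            g_sum += g * min(1.0, clip_norm / max(np.linalg.norm(g), 1e-12))
        # Gaussian noise scaled to the clipping threshold provides the DP guarantee.
        noise = rng.normal(0.0, noise_mult * clip_norm, size=w.shape)
        w -= lr * (g_sum + noise) / len(idx)
    return w

# Toy usage: private mean estimation on heavy-tailed, outlier-prone data.
data = np.random.default_rng(1).standard_t(df=2, size=500)
grad = lambda w, x: w - x  # gradient of the squared loss 0.5 * (w - x) ** 2
print(dp_sgd(grad, data, w0=np.zeros(1)))
```

Note that clipping introduces bias when gradients routinely exceed clip_norm, which is exactly the regime (outliers, non-Lipschitz losses) this paper analyzes.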

Tags: arxiv, extension, losses, optimization, outliers, stochastic
