Private Stochastic Optimization in the Presence of Outliers: Optimal Rates for (Non-Smooth) Convex Losses and Extension to Non-Convex Losses. (arXiv:2209.07403v1 [cs.LG])
Sept. 16, 2022, 1:13 a.m. | Andrew Lowy, Meisam Razaviyayn
stat.ML updates on arXiv.org
We study differentially private (DP) stochastic optimization (SO) with data containing outliers and loss functions that are not Lipschitz continuous. To date, the vast majority of work on DP SO assumes that the loss is Lipschitz (i.e., stochastic gradients are uniformly bounded), with error bounds that scale with the Lipschitz parameter of the loss. While this assumption is convenient, it is often unrealistic: in many practical problems where privacy is required, data may contain outliers or be unbounded, causing some …
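For context, a minimal sketch of why the bounded-gradient (Lipschitz) condition matters in DP optimization. This shows the standard DP-SGD clipping mechanism, not the algorithm proposed in this paper: clipping each per-sample gradient to a fixed norm bounds the sensitivity of the averaged gradient, which is what lets Gaussian noise be calibrated for privacy even when the loss itself is not Lipschitz. All function names and parameters below are illustrative, assuming NumPy.

import numpy as np

def dp_sgd_step(params, per_sample_grads, clip_norm=1.0,
                noise_multiplier=1.0, lr=0.1, rng=None):
    """One DP-SGD update: clip each sample's gradient, average, add noise."""
    rng = rng if rng is not None else np.random.default_rng()
    clipped = []
    for g in per_sample_grads:
        # Rescale so that ||g||_2 <= clip_norm. This enforces the
        # bounded-gradient condition artificially, even with outliers or
        # an unbounded (non-Lipschitz) loss, at the cost of clipping bias.
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    avg = np.mean(clipped, axis=0)
    # Gaussian noise calibrated to the clipping bound; privacy accounting
    # across steps is omitted in this sketch.
    sigma = noise_multiplier * clip_norm / len(per_sample_grads)
    return params - lr * (avg + rng.normal(0.0, sigma, size=avg.shape))

The bias that clipping introduces on heavy-tailed or outlier-laden data is exactly the kind of issue that motivates analyses tailored to non-Lipschitz losses, as in the abstract above.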
Jobs in AI, ML, Big Data
Data Architect
@ University of Texas at Austin | Austin, TX
Data ETL Engineer
@ University of Texas at Austin | Austin, TX
Lead GNSS Data Scientist
@ Lurra Systems | Melbourne
Senior Machine Learning Engineer (MLOps)
@ Promaton | Remote, Europe
Program Control Data Analyst
@ Ford Motor Company | Mexico
Vice President, Business Intelligence / Data & Analytics
@ AlphaSense | Remote - United States