Private Stochastic Optimization in the Presence of Outliers: Optimal Rates for (Non-Smooth) Convex Losses and Extension to Non-Convex Losses. (arXiv:2209.07403v2 [cs.LG] UPDATED)
Oct. 19, 2022, 1:13 a.m. | Andrew Lowy, Meisam Razaviyayn
cs.LG updates on arXiv.org
We study differentially private (DP) stochastic optimization (SO) with data containing outliers and loss functions that are (possibly) not Lipschitz continuous. To date, the vast majority of work on DP SO assumes that the loss is uniformly Lipschitz over data (i.e., stochastic gradients are uniformly bounded over all data points). While this assumption is convenient, it is often unrealistic: in many practical problems, the loss function may not be uniformly Lipschitz. Even when the loss function is Lipschitz continuous, the …
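For context, the uniform-Lipschitz assumption the abstract contrasts against can be written as a single bound on per-sample gradient norms. This is a minimal formalization in our own notation (ell for the loss, w for the parameters, x for a data point, L for the Lipschitz constant; the paper's notation may differ):

% Uniform Lipschitzness over data: one constant L bounds the gradient
% norm at every parameter w and every data point x.
\[
  \sup_{w \in \mathcal{W}} \; \sup_{x \in \mathcal{X}}
    \bigl\| \nabla_w \ell(w, x) \bigr\| \;\le\; L .
\]
% Equivalently, ell(., x) is L-Lipschitz in w for every x.

With outliers or heavy-tailed data, no finite L of this kind need exist, which is the regime the paper targets.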