Oct. 13, 2022, 1:12 a.m. | Hoang Tran, Ashok Cutkosky

cs.LG updates on arXiv.org

We introduce new algorithms and convergence guarantees for privacy-preserving non-convex Empirical Risk Minimization (ERM) on smooth $d$-dimensional objectives. We develop an improved sensitivity analysis of stochastic gradient descent on smooth objectives that exploits the recurrence of examples across different epochs. By combining this new approach with a recent analysis of momentum with private aggregation techniques, we provide an $(\epsilon,\delta)$-differentially private algorithm that finds a point with gradient norm $\tilde O\left(\frac{d^{1/3}}{(\epsilon N)^{2/3}}\right)$ in $O\left(\frac{N^{7/3}\epsilon^{4/3}}{d^{2/3}}\right)$ gradient evaluations, improving the previous best gradient bound of …
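
The abstract is terse about mechanics, so here is a minimal, illustrative Python sketch of the generic noisy-SGD-with-momentum template that private non-convex ERM methods of this kind build on. Everything in it is an assumption for illustration: the clipping threshold `clip`, noise multiplier `sigma`, and momentum parameter `beta` are generic knobs, and the paper's actual algorithm relies on a sharper sensitivity analysis and a private aggregation scheme rather than plain per-step Gaussian noise.

```python
import numpy as np

def dp_sgd_momentum(grad_fn, x0, data, epochs, lr=0.1, beta=0.9,
                    clip=1.0, sigma=1.0, rng=None):
    """Sketch of noisy SGD with momentum for private non-convex ERM.

    Per-example gradients are clipped to norm `clip` to bound sensitivity,
    then Gaussian noise calibrated to that sensitivity is added before the
    momentum update. Illustrative only: hyperparameters are assumed, and
    the paper's private aggregation mechanism is more sophisticated.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # momentum buffer
    for _ in range(epochs):
        for i in rng.permutation(len(data)):  # one pass over the examples
            g = grad_fn(x, data[i])
            # Clip the per-example gradient to bound its sensitivity.
            g = g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))
            # Gaussian mechanism: noise scaled to the clipping threshold.
            g_priv = g + rng.normal(0.0, sigma * clip, size=g.shape)
            # Momentum update on the privatized gradient.
            m = beta * m + (1.0 - beta) * g_priv
            x = x - lr * m
    return x

# Toy usage: private ERM on a tiny least-squares objective.
data = [(np.array([1.0, 2.0]), 3.0), (np.array([0.5, -1.0]), 0.0)]
grad = lambda x, ex: 2.0 * (ex[0] @ x - ex[1]) * ex[0]
x_hat = dp_sgd_momentum(grad, np.zeros(2), data, epochs=50)
```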

Tags: aggregation, arXiv, ERM
