Differentially Private Coordinate Descent for Composite Empirical Risk Minimization. (arXiv:2110.11688v2 [cs.LG] UPDATED)
Web: http://arxiv.org/abs/2110.11688
Jan. 31, 2022, 2:11 a.m. | Paul Mangold, Aurélien Bellet, Joseph Salmon, Marc Tommasi
cs.LG updates on arXiv.org
Machine learning models can leak information about the data used to train them. To mitigate this issue, Differentially Private (DP) variants of optimization algorithms such as Stochastic Gradient Descent (DP-SGD) have been designed to trade off utility for privacy in Empirical Risk Minimization (ERM) problems. In this paper, we propose Differentially Private proximal Coordinate Descent (DP-CD), a new method to solve composite DP-ERM problems. We derive utility guarantees through a novel theoretical analysis of inexact coordinate descent. Our results show that, thanks …
Latest AI/ML/Big Data Jobs
Machine Learning Product Manager (Europe, Remote)
@ FreshBooks | Germany
Field Operations and Data Engineer, ADAS
@ Lucid Motors | Newark, CA
Machine Learning Engineer - Senior
@ Novetta | Reston, VA
Analytics Engineer
@ ThirdLove | Remote
Senior Machine Learning Infrastructure Engineer - Safety
@ Discord | San Francisco, CA or Remote
Internship, Data Scientist
@ Everstream Analytics | United States (Remote)