Differential Privacy of Noisy (S)GD under Heavy-Tailed Perturbations
March 5, 2024, 2:44 p.m. | Umut Şimşekli, Mert Gürbüzbalaban, Sinan Yıldırım, Lingjiong Zhu
cs.LG updates on arXiv.org
Abstract: Injecting heavy-tailed noise into the iterates of stochastic gradient descent (SGD) has received increasing attention over the past few years. While various theoretical properties of the resulting algorithm have been analyzed, mainly from learning theory and optimization perspectives, its privacy preservation properties have not yet been established. Aiming to bridge this gap, we provide differential privacy (DP) guarantees for noisy SGD, when the injected noise follows an $\alpha$-stable distribution, which includes a spectrum of heavy-tailed …
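As a rough illustration of the setup the abstract describes, the sketch below runs plain SGD and adds symmetric $\alpha$-stable noise to each iterate, sampled with the Chambers-Mallows-Stuck method. The noise scale, learning rate, where the noise enters, and the toy least-squares objective are all illustrative assumptions, not taken from the paper, and nothing here reproduces the paper's DP analysis or its parameter choices.

```python
import numpy as np

def sample_sas(alpha, size, rng):
    """Draw symmetric alpha-stable noise via the Chambers-Mallows-Stuck method.

    alpha = 2 corresponds (up to scale) to Gaussian noise; smaller alpha gives
    heavier tails. Assumes alpha in (0, 2]; alpha = 1 is the Cauchy case.
    """
    v = rng.uniform(-np.pi / 2, np.pi / 2, size)
    w = rng.exponential(1.0, size)
    if np.isclose(alpha, 1.0):
        return np.tan(v)  # standard Cauchy draw
    return (np.sin(alpha * v) / np.cos(v) ** (1 / alpha)
            * (np.cos((1 - alpha) * v) / w) ** ((1 - alpha) / alpha))

def noisy_sgd_step(params, grad, lr, alpha, noise_scale, rng):
    """One SGD step with a heavy-tailed perturbation added to the iterate.

    noise_scale and the placement of the noise are illustrative choices.
    """
    noise = noise_scale * sample_sas(alpha, params.shape, rng)
    return params - lr * grad + noise

# Toy usage: mini-batch SGD on a synthetic least-squares problem.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(256, 10)), rng.normal(size=256)
theta = np.zeros(10)
for _ in range(100):
    idx = rng.integers(0, 256, size=32)                      # mini-batch indices
    grad = X[idx].T @ (X[idx] @ theta - y[idx]) / len(idx)   # batch gradient
    theta = noisy_sgd_step(theta, grad, lr=0.05, alpha=1.8,
                           noise_scale=0.01, rng=rng)
```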