Oct. 20, 2022, 1:12 a.m. | Junyuan Hong, Zhangyang Wang, Jiayu Zhou

Protecting privacy during learning while maintaining model performance has become increasingly critical in applications that involve sensitive data. Private Gradient Descent (PGD) is a commonly used private learning framework: at each iteration, it perturbs the gradient with noise calibrated to a differential privacy guarantee. Recent studies show that dynamic privacy schedules with decreasing noise magnitudes can improve the loss at the final iteration, yet theoretical understanding of why such schedules are effective, and of their connection to the underlying optimization algorithm, remains limited. In this paper, we …
