May 20, 2022, 1:12 a.m. | Andi Han, Bamdev Mishra, Pratik Jawanpuria, Junbin Gao

cs.LG updates on arXiv.org

In this paper, we study the differentially private empirical risk
minimization problem where the parameter is constrained to a Riemannian
manifold. We introduce a framework for differentially private Riemannian
optimization by adding noise to the Riemannian gradient on the tangent space.
The noise follows a Gaussian distribution intrinsically defined with respect to
the Riemannian metric. We adapt the Gaussian mechanism from the Euclidean space
to the tangent space so that it is compatible with such a generalized Gaussian
distribution. We show that this strategy presents …
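To make the idea above concrete, here is a minimal sketch (not the authors' exact mechanism) of one differentially private Riemannian gradient step on the unit sphere: the Euclidean gradient is projected onto the tangent space, Gaussian noise is drawn and projected into the same tangent space, and a retraction (renormalization) maps the update back to the manifold. The function names, step size, and noise scale `sigma` are illustrative assumptions.

```python
import numpy as np

def project_to_tangent(x, g):
    # Tangent space of the unit sphere at x: remove the radial component.
    return g - np.dot(x, g) * x

def dp_riemannian_step(x, euclid_grad, lr=0.1, sigma=0.5, rng=None):
    # One noisy Riemannian gradient step (illustrative; sigma would be
    # calibrated to the gradient sensitivity and the (eps, delta) budget).
    rng = np.random.default_rng() if rng is None else rng
    g = project_to_tangent(x, euclid_grad)
    # Gaussian noise drawn in the ambient space, then projected so it
    # lies in the tangent space at x.
    noise = project_to_tangent(x, sigma * rng.standard_normal(x.shape))
    x_new = x - lr * (g + noise)
    return x_new / np.linalg.norm(x_new)  # retraction back to the sphere
```

For example, minimizing f(x) = -⟨x, a⟩ over the sphere would pass `euclid_grad = -a` to each step; the iterate stays on the manifold by construction.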

arxiv math optimization
