April 24, 2024, 4:43 a.m. | Mohammad Afzali, Hassan Ashtiani, Christopher Liaw

cs.LG updates on arXiv.org

arXiv:2309.03847v3 Announce Type: replace-cross
Abstract: We study the problem of estimating mixtures of Gaussians under the constraint of differential privacy (DP). Our main result is that $\text{poly}(k,d,1/\alpha,1/\varepsilon,\log(1/\delta))$ samples are sufficient to estimate a mixture of $k$ Gaussians in $\mathbb{R}^d$ up to total variation distance $\alpha$ while satisfying $(\varepsilon, \delta)$-DP. This is the first finite sample complexity upper bound for the problem that does not make any structural assumptions on the GMMs.
To solve the problem, we devise a new framework …
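
For quick reference, the main bound quoted above can be restated in one line (a schematic paraphrase of the abstract's claim, not the paper's formal theorem statement): drawing $n = \text{poly}(k, d, 1/\alpha, 1/\varepsilon, \log(1/\delta))$ samples from the unknown mixture suffices to output a distribution within total variation distance $\alpha$ of the true $k$-component GMM in $\mathbb{R}^d$, with the entire estimation procedure satisfying $(\varepsilon, \delta)$-DP and no structural assumptions placed on the mixture components.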
