SoK: A Review of Differentially Private Linear Models For High-Dimensional Data
April 2, 2024, 7:42 p.m. | Amol Khanna, Edward Raff, Nathan Inkawhich
cs.LG updates on arXiv.org arxiv.org
Abstract: Linear models are ubiquitous in data science, but are particularly prone to overfitting and data memorization in high dimensions. To guarantee the privacy of training data, differential privacy can be used. Many papers have proposed optimization techniques for high-dimensional differentially private linear models, but a systematic comparison between these methods does not exist. We close this gap by providing a comprehensive review of optimization methods for private high-dimensional linear models. Empirical tests on all methods …
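To make the abstract's premise concrete, here is a minimal, hedged sketch of one classic way to train a differentially private linear model: output perturbation, where noise calibrated to the solution's sensitivity is added to a ridge regression estimate. This is an illustrative example of differential privacy applied to a linear model, not one of the specific high-dimensional methods surveyed in the paper; the sensitivity bound assumes rows of `X` have L2 norm at most 1 and labels lie in [-1, 1], and the constants are illustrative.

```python
import numpy as np

def dp_ridge_output_perturbation(X, y, lam=1.0, epsilon=1.0, delta=1e-5, rng=None):
    """Illustrative (epsilon, delta)-DP ridge regression via output perturbation.

    Assumes each row of X has L2 norm <= 1 and each label satisfies |y_i| <= 1,
    under which the L2 sensitivity of the regularized solution is bounded by
    2 / (n * lam). Constants are illustrative, not taken from the surveyed paper.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n, d = X.shape
    # Non-private ridge solution: (X^T X + n*lam*I)^{-1} X^T y
    w = np.linalg.solve(X.T @ X + n * lam * np.eye(d), X.T @ y)
    # Calibrate Gaussian noise to the solution's L2 sensitivity
    sensitivity = 2.0 / (n * lam)
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return w + rng.normal(0.0, sigma, size=d)
```

Because the noise scale grows with the dimension-independent sensitivity but the error accumulates across all `d` coordinates, accuracy degrades in high dimensions, which is exactly the regime the surveyed optimization methods target.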