Privacy-Preserving Logistic Regression Training with a Faster Gradient Variant. (arXiv:2201.10838v1 [cs.CR])
Web: http://arxiv.org/abs/2201.10838
Jan. 27, 2022, 2:10 a.m. | John Chiang
cs.LG updates on arXiv.org
Logistic regression training on an encrypted dataset has long been an attractive
approach to addressing security concerns. In this paper, we propose a faster
gradient variant, called Quadratic Gradient, for logistic regression, and
implement it via a special homomorphic encryption scheme. The core of this
gradient variant can be seen as an extension of the simplified fixed Hessian
from Newton's method, which extracts information from the Hessian matrix into
the naive gradient and can thus be used to enhance Nesterov's …
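The abstract sketches the idea of folding Hessian information into the plain gradient. A minimal plaintext sketch of one way this could look is below, assuming the simplified fixed Hessian is taken as a diagonal matrix whose entries dominate the rows of the Böhning bound (1/4)·XᵀX, so that the reciprocal of each diagonal entry acts as a per-coordinate step size; the function name and `eps` parameter are illustrative, not from the paper, and the encrypted-domain implementation is not shown.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def quadratic_gradient_step(X, y, w, eps=1e-8, lr=1.0):
    """One plaintext gradient-ascent step using a diagonal-scaled
    ("quadratic") gradient. The diagonal B follows the simplified
    fixed-Hessian idea: each entry bounds a row of (1/4) X^T X, so
    1 / B_ii acts as a per-coordinate learning rate. This is an
    illustrative sketch, not the paper's encrypted implementation.
    """
    # Naive logistic-regression log-likelihood gradient (labels y in {0, 1}).
    g = X.T @ (y - sigmoid(X @ w))
    # Simplified fixed-Hessian diagonal: row sums of |(1/4) X^T X|.
    B = 0.25 * np.sum(np.abs(X.T @ X), axis=1)
    # Quadratic gradient: element-wise rescaling of the naive gradient.
    G = g / (B + eps)
    return w + lr * G
```

Because the scaling matrix depends only on X, it can be computed once before training, which is what makes a fixed (rather than iteration-dependent) Hessian bound attractive in a homomorphic setting.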