June 23, 2022, 1:13 a.m. | Zhiqi Bu, Jialin Mao, Shiyun Xu

cs.CV updates on arXiv.org

Large convolutional neural networks (CNNs) can be difficult to train in the
differentially private (DP) regime, since the optimization algorithms require a
computationally expensive operation known as per-sample gradient clipping.
We propose an efficient and scalable implementation of this clipping on
convolutional layers, termed mixed ghost clipping, that significantly
eases private training in terms of both time and space complexity,
without affecting accuracy. The improvement in efficiency is rigorously
studied through the first complexity analysis …
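For intuition, below is a minimal sketch of the naive per-sample gradient clipping step in DP-SGD, written as a per-example loop in PyTorch. This is the computationally expensive baseline the abstract refers to, not the paper's mixed ghost clipping; the names model, loss_fn, clip_norm, and noise_multiplier are illustrative assumptions, not the paper's API.

import torch

def dp_sgd_step(model, loss_fn, xs, ys,
                clip_norm=1.0, noise_multiplier=1.0, lr=0.1):
    # Naive DP-SGD step: one backward pass per example, which is the
    # time/space bottleneck that ghost-clipping techniques aim to remove.
    params = [p for p in model.parameters() if p.requires_grad]
    summed = [torch.zeros_like(p) for p in params]
    for x, y in zip(xs, ys):
        loss = loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, params)
        # Clip the whole per-sample gradient to L2 norm <= clip_norm.
        total_norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        scale = (clip_norm / (total_norm + 1e-6)).clamp(max=1.0)
        for s, g in zip(summed, grads):
            s.add_(g * scale)
    n = len(xs)
    with torch.no_grad():
        for p, s in zip(params, summed):
            # Gaussian noise calibrated to the clipping norm, added once
            # to the summed (clipped) gradients before averaging.
            noise = torch.randn_like(p) * noise_multiplier * clip_norm
            p.add_(-(lr / n) * (s + noise))

The per-example loop is what makes this naive approach slow; materializing per-sample gradients all at once (as some DP libraries do) trades that time for memory instead, which is the trade-off the paper's complexity analysis addresses.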
