Dropout against Deep Leakage from Gradients. (arXiv:2108.11106v2 [cs.LG] UPDATED)
Nov. 16, 2022, 2:12 a.m. | Yanchong Zheng
cs.LG updates on arXiv.org
As the scale of data grows significantly nowadays, federated
learning (Bonawitz et al. [2019]) for high-performance computing and machine
learning has become more important than ever (Abadi et al. [2016]).
People used to believe that sharing gradients was safe and concealed the
local training data during the training stage. However, Zhu et al. [2019]
demonstrated that it is possible to recover the raw training data from the
shared gradients. They …
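To make the referenced attack concrete, below is a minimal PyTorch sketch of the gradient-matching procedure from Zhu et al. [2019] ("Deep Leakage from Gradients"): the attacker optimizes a dummy input and a soft label so that the gradients they induce match the gradients shared by the victim. The toy model, tensor shapes, seed, and iteration count are illustrative assumptions, not the paper's exact setup.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy victim model; architecture and input shape are assumptions for
# illustration only.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))

def soft_cross_entropy(logits, target_probs):
    # Cross-entropy against a soft (probability) label, so the label can
    # be optimized jointly with the dummy input.
    return -(target_probs * logits.log_softmax(dim=-1)).sum()

# Victim side: compute gradients on a private example and "share" them,
# as in gradient exchange during federated training.
x_true = torch.rand(1, 1, 28, 28)
y_true = torch.tensor([3])
loss_true = nn.functional.cross_entropy(model(x_true), y_true)
shared_grads = [g.detach() for g in
                torch.autograd.grad(loss_true, model.parameters())]

# Attacker side: optimize dummy data and a dummy label so their
# gradients reproduce the shared gradients.
x_dummy = torch.randn(1, 1, 28, 28, requires_grad=True)
y_dummy = torch.randn(1, 10, requires_grad=True)
opt = torch.optim.LBFGS([x_dummy, y_dummy])

def closure():
    opt.zero_grad()
    loss = soft_cross_entropy(model(x_dummy), y_dummy.softmax(dim=-1))
    dummy_grads = torch.autograd.grad(loss, model.parameters(),
                                      create_graph=True)
    # Gradient-matching objective: squared distance between the dummy
    # gradients and the shared gradients.
    grad_diff = sum(((dg - sg) ** 2).sum()
                    for dg, sg in zip(dummy_grads, shared_grads))
    grad_diff.backward()
    return grad_diff

for _ in range(50):
    opt.step(closure)

# A small reconstruction error indicates the private input leaked.
print((x_dummy - x_true).abs().mean())

The L-BFGS optimizer and the squared gradient distance follow the original DLG formulation; the paper excerpted here studies dropout as a defense against exactly this kind of reconstruction.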