DP-NMT: Scalable Differentially-Private Machine Translation
April 25, 2024, 5:45 p.m. | Timour Igamberdiev, Doan Nam Long Vu, Felix Künnecke, Zhuo Yu, Jannik Holmer, Ivan Habernal
cs.CL updates on arXiv.org
Abstract: Neural machine translation (NMT) is a widely used text generation task, yet there is a considerable research gap in the development of privacy-preserving NMT models, despite significant data privacy concerns for NMT systems. Differentially private stochastic gradient descent (DP-SGD) is a popular method for training machine learning models with concrete privacy guarantees; however, the implementation specifics of training a model with DP-SGD are not always made explicit in existing work, with differing software libraries used and …
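For readers unfamiliar with the mechanism the abstract names, below is a minimal sketch of one DP-SGD step (per-example gradient clipping followed by Gaussian noise, as in Abadi et al., 2016). The toy linear-regression loss, the clipping norm `C`, and the noise multiplier `sigma` are illustrative assumptions here, not details taken from the paper, which concerns scaling this recipe to NMT models.

```python
# Minimal DP-SGD sketch on a toy linear-regression model.
# Assumed for illustration: clipping norm C, noise multiplier sigma,
# squared-error loss. None of these values come from the paper.
import numpy as np

rng = np.random.default_rng(0)

def dp_sgd_step(w, X, y, lr=0.1, C=1.0, sigma=1.0):
    """One DP-SGD update: clip each per-example gradient to L2 norm C,
    sum the clipped gradients, add Gaussian noise with std sigma * C,
    then average and take a gradient step."""
    grads = []
    for xi, yi in zip(X, y):
        # Per-example gradient of 0.5 * (w.x - y)^2 w.r.t. w
        g = (w @ xi - yi) * xi
        # Clip so the gradient's L2 norm is at most C
        g = g / max(1.0, np.linalg.norm(g) / C)
        grads.append(g)
    # Gaussian mechanism: noise scale is proportional to the clipping norm
    noisy_sum = np.sum(grads, axis=0) + rng.normal(0.0, sigma * C, size=w.shape)
    return w - lr * noisy_sum / len(X)

# Toy usage: recover w close to [2, -1] from noisy synthetic data
X = rng.normal(size=(64, 2))
y = X @ np.array([2.0, -1.0]) + 0.01 * rng.normal(size=64)
w = np.zeros(2)
for _ in range(200):
    w = dp_sgd_step(w, X, y)
print(w)
```

Per-example clipping bounds each example's influence on the update (the sensitivity), which is what lets the added Gaussian noise yield a formal privacy guarantee; the privacy cost over many steps is then tracked with a separate accountant, omitted in this sketch.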