April 25, 2024, 5:45 p.m. | Timour Igamberdiev, Doan Nam Long Vu, Felix Künnecke, Zhuo Yu, Jannik Holmer, Ivan Habernal

cs.CL updates on arXiv.org

arXiv:2311.14465v2 Announce Type: replace
Abstract: Neural machine translation (NMT) is a widely popular text generation task, yet there is a considerable research gap in the development of privacy-preserving NMT models, despite significant data privacy concerns for NMT systems. Differentially private stochastic gradient descent (DP-SGD) is a popular method for training machine learning models with concrete privacy guarantees; however, the implementation specifics of training a model with DP-SGD are not always made clear in existing work, with differing software libraries used and …
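The abstract is truncated above, but for context, the core DP-SGD recipe (Abadi et al., 2016) that the paper builds on is: clip each per-example gradient to a fixed L2 norm, average, and add calibrated Gaussian noise before the descent step. Below is a minimal PyTorch sketch of one such update; the function name `dp_sgd_step`, the microbatch-of-1 loop, and the default hyperparameters are illustrative assumptions, not code from the paper or from any library it evaluates.

```python
import torch

def dp_sgd_step(model, loss_fn, examples, lr=0.5,
                clip_norm=1.0, noise_multiplier=1.0):
    # One DP-SGD update: clip each per-example gradient to
    # L2 norm <= clip_norm, sum, add Gaussian noise, average, descend.
    params = [p for p in model.parameters() if p.requires_grad]
    accum = [torch.zeros_like(p) for p in params]

    for x, y in examples:  # microbatches of size 1 expose per-example grads
        model.zero_grad(set_to_none=False)
        loss_fn(model(x), y).backward()
        # global L2 norm of this example's gradient over all parameters
        total_norm = torch.sqrt(sum((p.grad ** 2).sum() for p in params))
        scale = torch.clamp(clip_norm / (total_norm + 1e-12), max=1.0)
        for a, p in zip(accum, params):
            a.add_(p.grad * scale)

    n = len(examples)
    with torch.no_grad():
        for a, p in zip(accum, params):
            # noise std follows the standard sigma * C calibration
            noise = noise_multiplier * clip_norm * torch.randn_like(a)
            p.sub_(lr * (a + noise) / n)
```

In practice, libraries such as Opacus compute per-example gradients far more efficiently; the explicit loop here trades speed for transparency about where clipping and noise enter, which is exactly the kind of implementation detail the abstract says is often left unspecified.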
