Error Norm Truncation: Robust Training in the Presence of Data Noise for Text Generation Models
March 20, 2024, 4:48 a.m. | Tianjian Li, Haoran Xu, Philipp Koehn, Daniel Khashabi, Kenton Murray
cs.CL updates on arXiv.org
Abstract: Text generation models are notoriously vulnerable to errors in the training data. As massive amounts of web-crawled data become ever more widely available, how can we enhance the robustness of models trained on such noisy text? In our work, we propose Error Norm Truncation (ENT), a robust enhancement of the standard training objective that truncates noisy data. Compared to methods that use only the negative log-likelihood loss to …
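The truncated abstract leaves the mechanics implicit. Below is a minimal sketch of the idea, assuming that a token's "error norm" is the L2 distance between the model's predicted distribution and the one-hot target, and that tokens whose error norm exceeds a threshold are masked out of the loss. The function name, default threshold, and tensor shapes are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F

def ent_loss(logits, targets, threshold=1.0, ignore_index=-100):
    """Hypothetical sketch of Error Norm Truncation (ENT).

    Computes token-level cross-entropy, but zeroes out the loss for
    tokens whose error norm -- the L2 distance between the predicted
    distribution and the one-hot target -- exceeds `threshold`.

    logits:  (batch, seq_len, vocab_size)
    targets: (batch, seq_len) integer token ids
    """
    probs = logits.softmax(dim=-1)                           # (B, T, V)
    one_hot = F.one_hot(targets.clamp(min=0),
                        num_classes=probs.size(-1)).float()  # (B, T, V)

    # Per-token L2 error norm; its value lies in [0, sqrt(2)],
    # so threshold=1.0 is a mid-range illustrative default.
    error_norm = (probs - one_hot).norm(p=2, dim=-1)         # (B, T)

    # Standard negative log-likelihood per token, unreduced.
    nll = F.cross_entropy(
        logits.transpose(1, 2), targets,
        ignore_index=ignore_index, reduction="none",
    )                                                        # (B, T)

    # Truncate: drop high-error-norm tokens, which the method
    # treats as likely data noise, along with padding positions.
    keep = (error_norm <= threshold) & (targets != ignore_index)
    return (nll * keep).sum() / keep.sum().clamp(min=1)
```

In practice a training loop would simply swap this in for the usual cross-entropy call; the key design choice is that the gating signal comes from the full predicted distribution rather than from the likelihood of the target token alone.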