Nov. 9, 2022, 2:15 a.m. | Wenhao Zhu, Shujian Huang, Yunzhe Lv, Xin Zheng, Jiajun Chen

cs.CL updates on arXiv.org arxiv.org

kNN-MT presents a new paradigm for domain adaptation by building an external
datastore, which usually saves all target language token occurrences in the
parallel corpus. As a result, the constructed datastore is usually large and
possibly redundant. In this paper, we investigate the interpretability issue of
this approach: what knowledge does the NMT model need? We propose the notion of
local correctness (LAC) as a new angle, which describes the potential
translation correctness for a single entry and for a …
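The datastore-and-retrieval mechanism the abstract describes can be sketched as follows. This is a minimal toy illustration in NumPy, not the paper's implementation: the function names, squared-L2 distance, softmax temperature, and mixing weight are all illustrative assumptions.

```python
import numpy as np

def build_datastore(hidden_states, target_tokens):
    """Save every (decoder hidden state, target token) pair from the
    parallel corpus — one datastore entry per token occurrence."""
    keys = np.asarray(hidden_states, dtype=np.float32)
    values = np.asarray(target_tokens)
    return keys, values

def knn_distribution(query, keys, values, vocab_size, k=2, temperature=1.0):
    """Retrieve the k nearest entries to the query state and convert
    their distances into a distribution over target tokens."""
    dists = np.sum((keys - query) ** 2, axis=1)   # squared L2 distances
    nn = np.argsort(dists)[:k]                    # indices of k nearest entries
    weights = np.exp(-dists[nn] / temperature)    # closer entries weigh more
    weights /= weights.sum()
    p = np.zeros(vocab_size)
    for idx, w in zip(nn, weights):
        p[values[idx]] += w                       # aggregate weight per token
    return p

def interpolate(p_nmt, p_knn, lam=0.5):
    """Mix the NMT model's distribution with the retrieved kNN
    distribution; lam is a tunable interpolation weight."""
    return (1 - lam) * p_nmt + lam * p_knn
```

Because `build_datastore` keeps one entry per token occurrence, the datastore grows linearly with the corpus, which is the size and redundancy issue the paper targets by asking which entries the NMT model actually needs.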

arxiv domain adaptation knn knowledge memory