Web: http://arxiv.org/abs/2209.08738

Sept. 21, 2022, 1:14 a.m. | Qiang Wang, Rongxiang Weng, Ming Chen

cs.CL updates on arXiv.org

K-Nearest Neighbor Neural Machine Translation (kNN-MT) successfully
incorporates an external corpus by retrieving word-level representations at
test time. Generally, kNN-MT borrows the off-the-shelf context representation
from the translation task, e.g., the output of the last decoder layer, as the
query vector for the retrieval task. In this work, we highlight that coupling
the representations of these two tasks is sub-optimal for fine-grained
retrieval. To alleviate this, we leverage supervised contrastive learning to
learn a distinctive retrieval representation derived from the original …
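To make the setup concrete, below is a minimal NumPy sketch of the standard kNN-MT decoding step that the abstract builds on: the decoder's hidden state is used as a query against a datastore of (context representation, target token) pairs, and the resulting neighbor distribution is interpolated with the base model's distribution. The function name, parameters, and the specific interpolation weight are illustrative, not taken from the paper.

```python
import numpy as np

def knn_mt_probs(query, keys, values, vocab_size, k=4, temperature=10.0,
                 model_probs=None, lam=0.5):
    """Sketch of one kNN-MT next-token prediction step.

    query:  decoder hidden state at the current step, shape (d,)
    keys:   datastore keys (decoder states from the external corpus), (N, d)
    values: target token ids aligned with each key, shape (N,)
    """
    # Squared L2 distance from the query to every datastore key.
    dists = np.sum((keys - query) ** 2, axis=1)
    # Indices of the k nearest neighbors.
    idx = np.argsort(dists)[:k]
    # Softmax over negative (temperature-scaled) distances -> neighbor weights.
    logits = -dists[idx] / temperature
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()
    # Aggregate neighbor weights per target token to form p_kNN.
    p_knn = np.zeros(vocab_size)
    for w, v in zip(weights, values[idx]):
        p_knn[v] += w
    # Interpolate with the base NMT distribution, if provided.
    if model_probs is None:
        return p_knn
    return lam * p_knn + (1.0 - lam) * model_probs
```

The paper's observation is that `query` here is the same vector the translation task uses for its own output projection; the proposed method instead learns a separate retrieval representation (via supervised contrastive learning) to use as the query.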

