Web: http://arxiv.org/abs/2201.08471

Jan. 27, 2022, 2:10 a.m. | Suraj Nair, Eugene Yang, Dawn Lawrie, Kevin Duh, Paul McNamee, Kenton Murray, James Mayfield, Douglas W. Oard

cs.CL updates on arXiv.org

The advent of transformer-based models such as BERT has led to the rise of
neural ranking models. These models have improved the effectiveness of
retrieval systems well beyond that of lexical term matching models such as
BM25. While monolingual retrieval tasks have benefited from large-scale
training collections such as MS MARCO and advances in neural architectures,
cross-language retrieval tasks have fallen behind these advancements. This
paper introduces ColBERT-X, a generalization of the ColBERT
multi-representation dense retrieval model that uses the …
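
For readers unfamiliar with ColBERT's multi-representation approach that the excerpt mentions, the sketch below illustrates the late-interaction (MaxSim) scoring idea that ColBERT, and by extension ColBERT-X, builds on: every query and document token keeps its own embedding, and a document is scored by summing each query token's maximum similarity over the document tokens. This is an illustrative NumPy example, not the paper's code; the function name, embedding dimension, and random vectors are assumptions for demonstration only.

```python
# Minimal sketch of ColBERT-style late-interaction (MaxSim) scoring.
# Illustrative only: embeddings are random stand-ins for contextualized
# token vectors produced by a model such as ColBERT-X.
import numpy as np

def maxsim_score(query_vecs: np.ndarray, doc_vecs: np.ndarray) -> float:
    """Sum, over query tokens, the max cosine similarity to any doc token."""
    # Normalize rows so dot products equal cosine similarities.
    q = query_vecs / np.linalg.norm(query_vecs, axis=1, keepdims=True)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    sim = q @ d.T                         # (num_query_tokens, num_doc_tokens)
    return float(sim.max(axis=1).sum())   # MaxSim per query token, then sum

# Toy usage with random 128-dim token embeddings (dimension is an assumption).
rng = np.random.default_rng(0)
query = rng.standard_normal((8, 128))    # 8 query token embeddings
doc_a = rng.standard_normal((120, 128))  # 120 document token embeddings
doc_b = rng.standard_normal((95, 128))
print(maxsim_score(query, doc_a), maxsim_score(query, doc_b))
```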

