No Parameter Left Behind: How Distillation and Model Size Affect Zero-Shot Retrieval. (arXiv:2206.02873v2 [cs.IR] UPDATED)
Web: http://arxiv.org/abs/2206.02873
June 16, 2022, 1:12 a.m. | Guilherme Moraes Rosa, Luiz Bonifacio, Vitor Jeronymo, Hugo Abonizio, Marzieh Fadaee, Roberto Lotufo, Rodrigo Nogueira
Source: cs.CL updates on arXiv.org
Recent work has shown that small distilled language models are strong competitors to models that are orders of magnitude larger and slower across a wide range of information retrieval tasks. Due to latency constraints, this has made distilled and dense models the go-to choice for deployment in real-world retrieval applications. In this work, we question this practice by showing that the number of parameters and early query-document interaction play a significant role in the generalization ability of retrieval models. Our …
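The "early query-document interaction" the abstract credits is the cross-encoder setup, where the query and document pass through the model together so attention can mix them from the first layer, in contrast to the dense bi-encoder setup, which encodes each side independently and scores them with a dot product. Below is a minimal sketch of the two scoring paths, not the paper's code: the checkpoints are illustrative public ones and the mean pooling is one common convention, both assumptions on my part.

```python
# Sketch: dense bi-encoder (no early interaction) vs. cross-encoder
# (early query-document interaction), under the assumptions noted above.
import torch
from transformers import AutoModel, AutoModelForSequenceClassification, AutoTokenizer

query = "what causes tides"
doc = "Tides are caused by the gravitational pull of the moon and the sun."

# --- Bi-encoder: encode query and document separately, score by dot product. ---
bi_name = "sentence-transformers/all-MiniLM-L6-v2"  # illustrative checkpoint
bi_tok = AutoTokenizer.from_pretrained(bi_name)
bi_model = AutoModel.from_pretrained(bi_name)

def embed(text: str) -> torch.Tensor:
    inputs = bi_tok(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = bi_model(**inputs).last_hidden_state   # (1, seq_len, dim)
    mask = inputs["attention_mask"].unsqueeze(-1)       # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1) # mean pooling

dense_score = torch.matmul(embed(query), embed(doc).T).item()

# --- Cross-encoder: the concatenated pair interacts inside the model. ---
ce_name = "cross-encoder/ms-marco-MiniLM-L-6-v2"  # illustrative checkpoint
ce_tok = AutoTokenizer.from_pretrained(ce_name)
ce_model = AutoModelForSequenceClassification.from_pretrained(ce_name)
with torch.no_grad():
    pair = ce_tok(query, doc, return_tensors="pt", truncation=True)
    ce_score = ce_model(**pair).logits.item()  # this checkpoint emits one relevance logit

print(f"bi-encoder score:    {dense_score:.4f}")
print(f"cross-encoder score: {ce_score:.4f}")
```

The latency trade-off the abstract mentions falls out of this structure: because the bi-encoder never lets query tokens attend to document tokens, document embeddings can be precomputed and indexed, while the cross-encoder must run a forward pass per query-document pair; the abstract's claim is that giving up that early interaction, along with parameters, costs zero-shot generalization.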