all AI news
Collective Relevance Labeling for Passage Retrieval. (arXiv:2205.03273v1 [cs.IR])
Web: http://arxiv.org/abs/2205.03273
May 9, 2022, 1:11 a.m. | Jihyuk Kim, Minsoo Kim, Seung-won Hwang
cs.CL updates on arXiv.org
Deep learning for Information Retrieval (IR) requires a large number of high-quality query-document relevance labels, but such labels are inherently sparse. Label smoothing redistributes some of the observed probability mass over unobserved instances, often uniformly and uninformed of the true distribution. In contrast, we propose knowledge distillation for informed labeling, without incurring high computational overhead at evaluation time. Our contribution is designing a simple but efficient teacher model that utilizes collective knowledge to outperform state-of-the-art models distilled from a more complex teacher model. Specifically, …
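To make the contrast in the abstract concrete, here is a minimal sketch (not code from the paper) of how uniform label smoothing and teacher-distilled soft labels differ for one query's candidate documents. The function names, the epsilon and temperature values, and the score tensors are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: uniform label smoothing vs. teacher-informed soft labels
# for query-document relevance. All names and values are illustrative.
import torch
import torch.nn.functional as F

def smoothed_labels(hard_labels: torch.Tensor, epsilon: float = 0.1) -> torch.Tensor:
    """Uniform label smoothing: move epsilon of the observed probability mass
    onto all candidate documents, uninformed of the true relevance distribution."""
    num_docs = hard_labels.size(-1)
    return hard_labels * (1.0 - epsilon) + epsilon / num_docs

def distilled_labels(teacher_scores: torch.Tensor, temperature: float = 2.0) -> torch.Tensor:
    """Informed labeling via distillation: soft targets come from a teacher's
    relevance scores over the candidate documents."""
    return F.softmax(teacher_scores / temperature, dim=-1)

def student_loss(student_scores: torch.Tensor, targets: torch.Tensor,
                 temperature: float = 2.0) -> torch.Tensor:
    """KL divergence between the student's softened distribution and the soft targets."""
    log_probs = F.log_softmax(student_scores / temperature, dim=-1)
    return F.kl_div(log_probs, targets, reduction="batchmean")

# Example: one query with four candidate documents, only the first labeled relevant.
hard = torch.tensor([[1.0, 0.0, 0.0, 0.0]])
teacher = torch.tensor([[4.0, 2.5, 0.5, -1.0]])   # hypothetical teacher relevance scores
student = torch.tensor([[3.0, 1.0, 0.2, -0.5]])   # hypothetical student relevance scores

print(smoothed_labels(hard))        # same small mass on every unobserved document
print(distilled_labels(teacher))    # mass follows the teacher's relevance scores
print(student_loss(student, distilled_labels(teacher)))
```

Under the uniform scheme every unobserved document gets the same small probability, whereas the distilled targets track the teacher's scores; the latter is what the abstract calls informed labeling.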
Latest AI/ML/Big Data Jobs
Director, Applied Mathematics & Computational Research Division
@ Lawrence Berkeley National Lab | Berkeley, CA
Business Data Analyst
@ MainStreet Family Care | Birmingham, AL
Assistant/Associate Professor of the Practice in Business Analytics
@ Georgetown University McDonough School of Business | Washington DC
Senior Data Science Writer
@ NannyML | Remote
Director of AI/ML Engineering
@ Armis Industries | Remote (US only), St. Louis, California
Digital Analytics Manager
@ Patagonia | Ventura, California