RetroMAE: Pre-training Retrieval-oriented Transformers via Masked Auto-Encoder. (arXiv:2205.12035v1 [cs.CL])
May 25, 2022, 1:12 a.m. | Zheng Liu, Yingxia Shao
cs.CL updates on arXiv.org
Pre-trained models have demonstrated superior power on many important tasks.
However, it remains an open problem how to design effective pre-training
strategies that improve the models' usability for dense retrieval. In this
paper, we propose a novel pre-training framework for dense retrieval based on
the Masked Auto-Encoder, known as RetroMAE. Our proposed framework is
highlighted by the following critical designs: 1) an MAE-based pre-training
workflow, where the input sentence is polluted on both the encoder and decoder sides …
Jobs in AI, ML, Big Data
Senior Data Engineer
@ Publicis Groupe | New York City, United States
Associate Principal Robotics Engineer - Research.
@ Dyson | United Kingdom - Hullavington Office
Dual study program with in-depth practical training: Bachelor of Science in Artificial Intelligence and Data Science (m/f/d)
@ Gerresheimer | Wackersdorf, Germany
AI/ML Engineer (TS/SCI) {S}
@ ARKA Group, LP | Aurora, Colorado, United States
Data Integration Engineer
@ Find.co | Sliema
Data Engineer
@ Q2 | Bengaluru, India