May 25, 2022, 1:12 a.m. | Terra Blevins, Hila Gonen, Luke Zettlemoyer

cs.CL updates on arXiv.org

The emergent cross-lingual transfer seen in multilingual pretrained models
has sparked significant interest in studying their behavior. However, because
these analyses have focused on fully trained multilingual models, little is
known about the dynamics of the multilingual pretraining process. We
investigate when these models acquire their in-language and cross-lingual
abilities by probing checkpoints taken from throughout XLM-R pretraining, using
a suite of linguistic tasks. Our analysis shows that the model achieves high
in-language performance early on, with lower-level linguistic skills …
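To make the probing methodology concrete: a small classifier is trained on frozen representations from a pretraining checkpoint, so its accuracy reflects what that checkpoint already encodes rather than what fine-tuning could teach it. Below is a minimal sketch in Python, assuming the Hugging Face Transformers API; the checkpoint name, toy dataset, and binary task are illustrative stand-ins, not the paper's actual intermediate checkpoints or task suite.

```python
# Minimal linear-probe sketch over a frozen encoder checkpoint.
# Assumptions: Hugging Face Transformers + PyTorch; "xlm-roberta-base"
# stands in for an intermediate pretraining checkpoint, and the toy
# grammaticality pairs stand in for a real linguistic task.
import torch
from transformers import AutoModel, AutoTokenizer

CHECKPOINT = "xlm-roberta-base"  # hypothetical stand-in checkpoint

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
encoder = AutoModel.from_pretrained(CHECKPOINT)
encoder.eval()  # encoder weights stay frozen throughout

# Toy (text, label) probe data: 1 = grammatical, 0 = scrambled.
train_data = [("The cat sleeps.", 1), ("Cat the sleeps.", 0)]

def embed(text):
    """Mean-pool the final hidden states without tracking gradients."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state  # (1, seq_len, dim)
    return hidden.mean(dim=1).squeeze(0)              # (dim,)

# The probe is a single linear layer; only its parameters are trained,
# so its accuracy measures information already present in the checkpoint.
probe = torch.nn.Linear(encoder.config.hidden_size, 2)
optimizer = torch.optim.Adam(probe.parameters(), lr=1e-3)
loss_fn = torch.nn.CrossEntropyLoss()

for epoch in range(10):
    for text, label in train_data:
        logits = probe(embed(text)).unsqueeze(0)      # (1, 2)
        loss = loss_fn(logits, torch.tensor([label]))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Repeating this procedure for each saved checkpoint, and for each task in the suite, traces when a given linguistic ability emerges during pretraining.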

arxiv, cross-lingual, dynamics, language, language models
