A Study of Continual Learning Under Language Shift. (arXiv:2311.01200v1 [cs.CL])
cs.LG updates on arXiv.org
The recent increase in data and model scale for language model pre-training
has led to huge training costs. In scenarios where new data become available
over time, updating a model instead of fully retraining it would therefore
provide significant gains. In this paper, we study the benefits and downsides
of updating a language model when new data come from new languages - the case
of continual learning under language shift. Starting from a monolingual English
language model, we incrementally add …
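The core idea — incrementally updating one model as data in a new language arrives, rather than retraining from scratch — can be illustrated with a toy sketch. This is a hypothetical character-bigram count model in plain Python, not the paper's Transformer setup; the `update` and `neg_log_likelihood` helpers are assumptions for illustration only.

```python
from collections import defaultdict
import math

def update(counts, text):
    """Continual update: add bigram statistics from new text to the existing model."""
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def neg_log_likelihood(counts, text, alpha=1.0):
    """Average add-alpha-smoothed negative log-likelihood of text under the model."""
    vocab = {c for row in counts.values() for c in row} | set(counts)
    V = max(len(vocab), 1)
    nll, n = 0.0, 0
    for a, b in zip(text, text[1:]):
        total = sum(counts[a].values())
        p = (counts[a][b] + alpha) / (total + alpha * V)
        nll += -math.log(p)
        n += 1
    return nll / max(n, 1)

model = defaultdict(lambda: defaultdict(int))
update(model, "the cat sat on the mat " * 20)        # start from English-only data
before = neg_log_likelihood(model, "el gato en la mesa")
update(model, "el gato se sienta en la mesa " * 20)  # new language arrives: update, don't retrain
after = neg_log_likelihood(model, "el gato en la mesa")
print(after < before)  # the updated model fits the new language better
```

The point of the sketch is only that the second `update` call reuses the model trained on English instead of discarding it, which is the cost saving the abstract argues for at language-model scale.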