April 9, 2024, 4:51 a.m. | Vladimir Solovyev, Danni Liu, Jan Niehues

cs.CL updates on arXiv.org

arXiv:2404.05720v1 Announce Type: new
Abstract: Finetuning pretrained models on downstream generation tasks often leads to catastrophic forgetting in zero-shot conditions. In this work, we focus on summarization and tackle the problem through the lens of language-independent representations. After training on monolingual summarization, we perform zero-shot transfer to new languages or language pairs. We first show that naively finetuned models are highly language-specific in both output behavior and internal representations, resulting in poor zero-shot performance. Next, we propose query-key (QK) finetuning to …
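The abstract is cut off before the details of query-key (QK) finetuning, but the name suggests updating only the attention query and key projections while freezing the rest of the model. Below is a minimal sketch of that reading; the model choice (google/mt5-small) and the T5-style parameter names (".q.weight", ".k.weight") are illustrative assumptions, not the paper's exact setup.

```python
# Sketch of query-key (QK) finetuning: keep only the attention
# query/key projection matrices trainable, freeze everything else.
# Assumes T5-style parameter naming (e.g. "...SelfAttention.q.weight");
# other architectures use names like "q_proj"/"k_proj" instead.
from transformers import AutoModelForSeq2SeqLM

model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-small")

for name, param in model.named_parameters():
    # Trainable only if this is a query or key projection weight.
    param.requires_grad = name.endswith((".q.weight", ".k.weight"))

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable parameters: {trainable:,} / {total:,}")
```

Under this reading, finetuning adjusts how attention is routed (queries and keys) without overwriting the value/output pathways, which would plausibly keep internal representations closer to the pretrained, more language-independent ones.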
