Making Pre-trained Language Models Better Continual Few-Shot Relation Extractors
Feb. 27, 2024, 5:49 a.m. | Shengkun Ma, Jiale Han, Yi Liang, Bo Cheng
cs.CL updates on arXiv.org arxiv.org
Abstract: Continual Few-shot Relation Extraction (CFRE) is a practical problem that requires the model to continuously learn novel relations while avoiding forgetting old ones with few labeled training data. The primary challenges are catastrophic forgetting and overfitting. This paper harnesses prompt learning to explore the implicit capabilities of pre-trained language models to address the above two challenges, thereby making language models better continual few-shot relation extractors. Specifically, we propose a Contrastive Prompt Learning framework, which designs …
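The abstract's contrastive component can be illustrated with a generic supervised contrastive loss over relation embeddings: examples of the same relation are pulled together and different relations pushed apart, which helps separate novel relations from old ones with few labels. This is a minimal, hypothetical sketch of that general objective, not the paper's actual Contrastive Prompt Learning framework; the function name and inputs are illustrative assumptions.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Generic supervised contrastive loss over relation embeddings.

    Hypothetical sketch: pulls same-relation examples together and pushes
    different relations apart. Not the paper's exact objective.
    """
    # L2-normalize so the dot product is cosine similarity
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature  # pairwise scaled similarities
    n = len(labels)
    loss, count = 0.0, 0
    for i in range(n):
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not positives:
            continue  # no same-relation partner for this anchor
        others = [j for j in range(n) if j != i]
        # log of the softmax denominator over all other examples
        log_denom = np.log(np.sum(np.exp(sim[i, others])))
        for p in positives:
            loss += -(sim[i, p] - log_denom)
            count += 1
    return loss / count
```

With few-shot data, such a loss rewards embeddings where same-relation pairs score higher than cross-relation pairs, so a batch whose labels match its embedding clusters yields a lower loss than one whose labels are scrambled.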