Towards Continual Knowledge Learning of Language Models. (arXiv:2110.03215v4 [cs.CL] UPDATED)
May 25, 2022, 1:12 a.m. | Joel Jang, Seonghyeon Ye, Sohee Yang, Joongbo Shin, Janghoon Han, Gyeonghun Kim, Stanley Jungkyu Choi, Minjoon Seo
cs.CL updates on arXiv.org arxiv.org
Large Language Models (LMs) are known to encode world knowledge in their
parameters as they pretrain on vast web corpora, and this knowledge is often
leveraged for knowledge-dependent downstream tasks such as question
answering, fact-checking, and open dialogue. In real-world scenarios, the world
knowledge stored in an LM can quickly become outdated as the world changes,
yet it is non-trivial to avoid catastrophic forgetting and reliably acquire new
knowledge while preserving invariant knowledge. To push the community towards …
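The tension the abstract describes can be made concrete with a toy sketch (not the paper's method): a two-weight linear model trained by SGD first learns an "old fact" (task A), then is fine-tuned on a "new fact" (task B) that shares a feature with A. Naive fine-tuning degrades the old fact; interleaving replayed old examples (a simple rehearsal baseline) preserves it. All task data and step counts here are illustrative assumptions.

```python
def sgd_step(w, x, y, lr=0.1):
    """One SGD step on squared loss 0.5 * (w.x - y)^2."""
    pred = sum(wi * xi for wi, xi in zip(w, x))
    err = pred - y
    return [wi - lr * err * xi for wi, xi in zip(w, x)]

def train(w, examples, steps, lr=0.1):
    """Cycle through `examples` for `steps` SGD updates."""
    w = list(w)
    for i in range(steps):
        x, y = examples[i % len(examples)]
        w = sgd_step(w, x, y, lr)
    return w

def pred(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

# Task A is the "old fact"; task B is a "new fact" sharing a feature with A,
# so gradient updates on B overwrite part of what was learned for A.
task_a = [((1.0, 0.0), 1.0)]
task_b = [((1.0, 1.0), 0.0)]

w = train([0.0, 0.0], task_a, steps=100)       # learn the old knowledge

naive = train(w, task_b, steps=100)            # fine-tune on new data only
replay = train(w, task_b + task_a, steps=400)  # interleave replayed old data

print(pred(naive, (1.0, 0.0)))   # old fact degraded: ~0.5 instead of 1.0
print(pred(replay, (1.0, 0.0)))  # old fact preserved: ~1.0
```

Both runs fit the new fact, but only the rehearsal run keeps the old prediction near its target, which is the trade-off (plasticity vs. retention) that continual knowledge learning methods aim to manage at LM scale.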