Enhancing Multilingual Language Model with Massive Multilingual Knowledge Triples. (arXiv:2111.10962v4 [cs.CL] UPDATED)
cs.CL updates on arXiv.org
Knowledge-enhanced language representation learning has shown promising
results across various knowledge-intensive NLP tasks. However, prior methods
make limited and inefficient use of multilingual knowledge graph (KG) data
for language model (LM) pretraining. They often train LMs with KGs in indirect
ways, relying on extra entity/relation embeddings to facilitate knowledge
injection. In this work, we explore methods to make better use of the
multilingual annotations and the language-agnostic property of KG triples, and
present novel knowledge-based multilingual language models (KMLMs) …
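The direct-injection idea the abstract contrasts with embedding-based methods can be illustrated with a minimal sketch: KG triples carry parallel surface forms across languages, so they can be linearized into plain-text sentences and corrupted MLM-style, letting the LM consume knowledge as ordinary text. The triple data, `linearize` format, and `[MASK]` corruption below are illustrative assumptions, not the paper's exact procedure.

```python
# Hypothetical sketch: turning multilingual KG triples into plain-text
# pretraining sentences, so knowledge enters the LM as text rather than
# via separate entity/relation embeddings. All data here is illustrative.
import random


def linearize(triple):
    """Join a (head, relation, tail) triple into one pretraining sentence."""
    head, relation, tail = triple
    return f"{head} {relation} {tail} ."


def mask_tokens(sentence, mask_rate=0.15, rng=None):
    """Randomly replace tokens with [MASK] (standard MLM-style corruption)."""
    rng = rng or random.Random(0)
    tokens = sentence.split()
    return " ".join(
        "[MASK]" if rng.random() < mask_rate else tok for tok in tokens
    )


# The same language-agnostic fact rendered with language-specific surface
# forms -- this parallelism is what multilingual KG annotation provides.
triples = {
    "en": ("Paris", "capital of", "France"),
    "de": ("Paris", "Hauptstadt von", "Frankreich"),
}

for lang, triple in triples.items():
    print(lang, "->", mask_tokens(linearize(triple)))
```

Sentences produced this way can be mixed into an ordinary multilingual pretraining corpus; the paired translations of each triple are what make the knowledge transferable across languages.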