Oct. 18, 2022, 1:12 a.m. | Linlin Liu, Xin Li, Ruidan He, Lidong Bing, Shafiq Joty, Luo Si

cs.CL updates on arXiv.org

Knowledge-enhanced language representation learning has shown promising
results across various knowledge-intensive NLP tasks. However, prior methods
make limited use of multilingual knowledge graph (KG) data for language model
(LM) pretraining: they typically train LMs with KGs indirectly, relying on
extra entity/relation embeddings to facilitate knowledge injection. In this
work, we explore methods to make better use of the multilingual annotations
and the language-agnostic property of KG triples, and present novel
knowledge-based multilingual language models (KMLMs) …
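The truncated abstract contrasts indirect knowledge injection (extra entity/relation embeddings) with pretraining directly on multilingual KG triples. As a rough illustration of the direct route, the sketch below linearizes a triple with per-language surface forms into plain or code-switched sentences and masks the entity mentions for standard masked-LM pretraining. The triple format, function names, and masking scheme are illustrative assumptions, not the paper's actual pipeline.

```python
import random

# A Wikidata-style triple with surface forms in several languages
# (illustrative structure; not the paper's actual data format).
triple = {
    "head":     {"en": "Berlin", "de": "Berlin", "fr": "Berlin"},
    "relation": {"en": "is the capital of",
                 "de": "ist die Hauptstadt von",
                 "fr": "est la capitale de"},
    "tail":     {"en": "Germany", "de": "Deutschland", "fr": "Allemagne"},
}

def linearize(triple, lang):
    """Turn a triple into a plain sentence in a single language."""
    return f'{triple["head"][lang]} {triple["relation"][lang]} {triple["tail"][lang]}.'

def code_switch(triple, langs, rng=random):
    """Sample a (possibly different) language per slot, exploiting the
    language-agnostic structure of the triple to mix languages."""
    parts = [triple[slot][rng.choice(langs)] for slot in ("head", "relation", "tail")]
    return " ".join(parts) + "."

def mask_entities(sentence, entities, mask_token="[MASK]"):
    """Mask entity mentions so MLM pretraining must recover them
    from the relational context (a knowledge-oriented objective)."""
    for ent in entities:
        sentence = sentence.replace(ent, mask_token)
    return sentence

langs = ["en", "de", "fr"]
print(code_switch(triple, langs))   # e.g. "Berlin est la capitale de Deutschland."
print(mask_entities(linearize(triple, "en"),
                    [triple["head"]["en"], triple["tail"]["en"]]))
# -> "[MASK] is the capital of [MASK]."
```

Because the same triple yields parallel sentences in every language, such data needs no language-specific entity embeddings: the LM sees the knowledge directly in token form, in monolingual and code-switched variants alike.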

