LP-BERT: Multi-task Pre-training Knowledge Graph BERT for Link Prediction. (arXiv:2201.04843v1 [cs.CL])
Jan. 14, 2022, 2:10 a.m. | Da Li, Ming Yi, Yukai He
cs.CL updates on arXiv.org (arxiv.org)
Link prediction plays a significant role in knowledge graphs, which are an important resource for many artificial intelligence tasks but are often limited by incompleteness. In this paper, we propose a knowledge graph BERT for link prediction, named LP-BERT, which contains two training stages: multi-task pre-training and knowledge graph fine-tuning. The pre-training strategy not only uses the Mask Language Model (MLM) to learn the knowledge of the context corpus, but also introduces the Mask Entity Model (MEM) and Mask Relation Model (MRM), which …
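
The abstract describes three masking objectives used during multi-task pre-training: MLM over context tokens, MEM over entity tokens, and MRM over relation tokens. The sketch below illustrates how a serialized (head, relation, tail) triple might be corrupted for each objective; the serialization format, masking granularity, and helper names are assumptions for illustration and not the authors' implementation.

# Minimal sketch (not the LP-BERT code): corrupting a serialized triple
# for MLM-, MEM-, and MRM-style objectives. All names are hypothetical.
import random

MASK = "[MASK]"

def serialize_triple(head_text, relation_text, tail_text):
    # Assumed textual serialization of a (head, relation, tail) triple.
    return {"head": head_text.split(),
            "relation": relation_text.split(),
            "tail": tail_text.split()}

def mask_entity(triple):
    # MEM-style corruption: hide one entity's tokens so the model must
    # predict them from the remaining entity and the relation.
    side = random.choice(["head", "tail"])
    masked = dict(triple)
    masked[side] = [MASK] * len(triple[side])
    return masked, side

def mask_relation(triple):
    # MRM-style corruption: hide the relation tokens so the model must
    # predict them from the two entities.
    masked = dict(triple)
    masked["relation"] = [MASK] * len(triple["relation"])
    return masked, "relation"

def mask_random_tokens(triple, p=0.15):
    # MLM-style corruption: mask individual tokens anywhere in the
    # serialized triple, as in standard BERT pre-training.
    return {key: [MASK if random.random() < p else tok for tok in toks]
            for key, toks in triple.items()}

if __name__ == "__main__":
    t = serialize_triple("Marie Curie", "place of birth", "Warsaw")
    print(mask_entity(t))
    print(mask_relation(t))
    print(mask_random_tokens(t))

In practice the masked triple would be re-joined into a token sequence and fed to the encoder with a prediction head per objective; those details are not given in the truncated abstract.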