Integrating Knowledge Graph embedding and pretrained Language Models in Hypercomplex Spaces. (arXiv:2208.02743v1 [cs.CL])
cs.CL updates on arXiv.org
Knowledge Graphs, such as Wikidata, represent knowledge through two modalities:
structural and textual. For each modality, dedicated approaches learn patterns
that allow novel structural knowledge to be predicted: graph embedding models
for the structural part and language models for the textual part. Few
approaches integrate learning and inference across both modalities, and those
that exist exploit the interaction of structural and textual knowledge only
partially. In our approach, we build on existing strong representations of the
single modalities and we use hypercomplex algebra …
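The abstract is truncated before the method details, but the general idea of hypercomplex knowledge-graph embedding can be illustrated with quaternions, the most common hypercomplex number system used for this purpose (e.g. in QuatE-style models). The sketch below is an illustrative assumption, not the paper's actual scoring function: a relation acts on a head-entity quaternion via the Hamilton product, and plausibility is the inner product with the tail embedding. All names (`hamilton_product`, `score`) and the toy embeddings are hypothetical.

```python
import numpy as np

def hamilton_product(q, p):
    """Hamilton product of two quaternions (a + bi + cj + dk),
    each given as a length-4 array [a, b, c, d]."""
    a1, b1, c1, d1 = q
    a2, b2, c2, d2 = p
    return np.array([
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,  # real part
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,  # i component
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,  # j component
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,  # k component
    ])

def score(head, rel, tail):
    """QuatE-style triple plausibility (illustrative only): rotate the head
    embedding by the unit-normalised relation quaternion, then take the
    inner product with the tail embedding."""
    rel = rel / np.linalg.norm(rel)
    return float(np.dot(hamilton_product(head, rel), tail))

# Toy 4-dimensional embeddings: the identity quaternion as the head,
# a pure-i quaternion as both relation and tail.
h = np.array([1.0, 0.0, 0.0, 0.0])
r = np.array([0.0, 1.0, 0.0, 0.0])
t = np.array([0.0, 1.0, 0.0, 0.0])
print(score(h, r, t))  # 1.0: rotating the head by r yields exactly t
```

In practice each entity and relation would be a vector of many such quaternion coordinates, and the model would be trained so that observed triples score higher than corrupted ones; the Hamilton product's non-commutativity is what lets hypercomplex models capture asymmetric relations.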