Self-attention Presents Low-dimensional Knowledge Graph Embeddings for Link Prediction. (arXiv:2112.10644v2 [cs.LG] UPDATED)
cs.LG updates on arXiv.org
A few models have tackled the link prediction problem, also known as
knowledge graph completion, by embedding knowledge graphs in comparatively
low-dimensional spaces. However, state-of-the-art results are attained only by
considerably increasing the dimensionality of the embeddings, which causes
scalability issues for very large knowledge bases. Transformers have recently
been used successfully as powerful encoders for knowledge graphs, but the
available models still suffer from scalability issues. To address this
limitation, we introduce a Transformer-based model …
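The truncated abstract does not show the paper's actual model, but the underlying task it describes can be sketched generically. The toy example below, a minimal NumPy sketch assuming a TransE-style translational scorer (not the paper's Transformer-based architecture), illustrates embedding-based link prediction: entities and relations live in a low-dimensional space, and candidate tails for a query (head, relation, ?) are ranked by score.

```python
import numpy as np

# Hypothetical toy setup: embedding-based link prediction with a
# TransE-style score. This is NOT the paper's model; it only
# illustrates the general task of scoring (head, relation, tail)
# triples in a low-dimensional embedding space.

rng = np.random.default_rng(0)
n_entities, n_relations, dim = 5, 2, 8   # deliberately low-dimensional

E = rng.normal(size=(n_entities, dim))   # entity embeddings
R = rng.normal(size=(n_relations, dim))  # relation embeddings

def score(h, r, t):
    """Higher is better: negative L2 distance ||E[h] + R[r] - E[t]||."""
    return -np.linalg.norm(E[h] + R[r] - E[t])

def predict_tail(h, r):
    """Rank all entities as candidate tails for the query (h, r, ?)."""
    scores = np.array([score(h, r, t) for t in range(n_entities)])
    return np.argsort(-scores)  # best-scoring candidate first

ranking = predict_tail(h=0, r=1)
print(ranking)
```

A Transformer-based encoder would replace the simple additive scorer with contextualized entity/relation representations, which is where the scalability trade-off the abstract mentions comes in: richer encoders tend to demand higher-dimensional embeddings or more compute per triple.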