Mask and Reason: Pre-Training Knowledge Graph Transformers for Complex Logical Queries. (arXiv:2208.07638v1 [cs.LG])
cs.LG updates on arXiv.org
Knowledge graph (KG) embeddings have been a mainstream approach for reasoning
over incomplete KGs. However, limited by their inherently shallow and static
architectures, they can hardly deal with the rising focus on complex logical
queries, which comprise logical operators, imputed edges, multiple source
entities, and unknown intermediate entities. In this work, we present the
Knowledge Graph Transformer (kgTransformer) with masked pre-training and
fine-tuning strategies. We design a KG triple transformation method to enable
Transformer to handle KGs, which is further …
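The abstract mentions turning KG triples into a form a Transformer can consume and pre-training with masking. As a rough illustration only (not the paper's actual kgTransformer method), the following toy sketch linearizes a (head, relation, tail) triple into a token sequence and masks the entity positions, which a model would then be trained to recover:

```python
import random

# Toy sketch, not the paper's method: linearize KG triples into token
# sequences and mask entities for masked pre-training.
triples = [
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
]

def linearize(triple):
    # One triple becomes a 3-token sequence: [head, relation, tail].
    return list(triple)

def mask_entities(tokens, mask_token="[MASK]", p=1.0, rng=None):
    # Mask the head/tail positions (indices 0 and 2) with probability p;
    # the recovered targets serve as the pre-training labels.
    rng = rng or random.Random(0)
    out = list(tokens)
    targets = {}
    for i in (0, 2):
        if rng.random() < p:
            targets[i] = out[i]
            out[i] = mask_token
    return out, targets

seq = linearize(triples[0])
masked, targets = mask_entities(seq)
print(masked)   # ['[MASK]', 'capital_of', '[MASK]']
print(targets)  # {0: 'Paris', 2: 'France'}
```

A real implementation would map tokens to embeddings and train a Transformer to predict the masked entities; this fragment only shows the data-preparation idea.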