Knowledge Graph Completion with Pre-trained Multimodal Transformer and Twins Negative Sampling. (arXiv:2209.07084v1 [cs.AI])
cs.CL updates on arXiv.org
Knowledge graphs (KGs), which model world knowledge as structured
triples, are inevitably incomplete. The same problem exists for multimodal
knowledge graphs (MMKGs). Knowledge graph completion (KGC) is therefore of
great importance for predicting the missing triples in existing KGs. Among
existing KGC methods, embedding-based methods rely on manual design to leverage
multimodal information, while finetune-based approaches do not outperform
embedding-based methods in link prediction. To address these problems, we
propose a VisualBERT-enhanced Knowledge Graph Completion …
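The excerpt does not detail the paper's "twins" negative sampling, but the standard negative sampling used to train KGC link-prediction models corrupts the head or tail of a true triple with a random entity, filtering out corruptions that are themselves known true triples. A minimal sketch of that baseline (all names here are illustrative, not from the paper):

```python
import random

def corrupt_triple(triple, entities, true_triples, n_neg=4, seed=None):
    """Generate negative samples for a KG triple (h, r, t) by replacing
    the head or tail with a random entity, skipping known true triples."""
    rng = random.Random(seed)
    h, r, t = triple
    negatives = []
    while len(negatives) < n_neg:
        e = rng.choice(entities)
        # Corrupt head or tail with equal probability.
        cand = (e, r, t) if rng.random() < 0.5 else (h, r, e)
        # Filtered setting: never emit a triple known to be true.
        if cand not in true_triples:
            negatives.append(cand)
    return negatives

# Toy KG: entities and a few gold triples.
entities = ["alice", "bob", "carol", "dave"]
gold = {("alice", "knows", "bob"), ("bob", "knows", "carol")}
negs = corrupt_triple(("alice", "knows", "bob"), entities, gold, n_neg=4, seed=0)
```

Each negative shares the relation (and one endpoint) with the positive triple, which is what makes these corruptions informative for a margin- or cross-entropy-based link-prediction loss.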