Aug. 17, 2022, 1:10 a.m. | Xiao Liu, Shiyu Zhao, Kai Su, Yukuo Cen, Jiezhong Qiu, Mengdi Zhang, Wei Wu, Yuxiao Dong, Jie Tang

cs.LG updates on arXiv.org

Knowledge graph (KG) embeddings have been a mainstream approach for reasoning
over incomplete KGs. However, limited by their inherently shallow and static
architectures, they can hardly deal with the rising focus on complex logical
queries, which comprise logical operators, imputed edges, multiple source
entities, and unknown intermediate entities. In this work, we present the
Knowledge Graph Transformer (kgTransformer) with masked pre-training and
fine-tuning strategies. We design a KG triple transformation method to enable
the Transformer to handle KGs, which is further …
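The abstract only gestures at how KG triples become Transformer inputs and how masked pre-training fits in. Below is a minimal, hypothetical PyTorch sketch of that general idea, not the paper's actual architecture: entities and relations share one token vocabulary, each (h, r, t) triple is serialized into a three-token sequence, and the tail entity is replaced by a [MASK] token so link prediction reduces to masked-token prediction. All names and sizes (TinyKGTransformer, triple_to_tokens, MASK_ID, the toy vocabulary) are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Toy shared vocabulary: entities, relations, and one [MASK] token.
# (Sizes are arbitrary; the real model's vocabulary comes from the KG.)
NUM_ENTITIES, NUM_RELATIONS = 100, 10
MASK_ID = NUM_ENTITIES + NUM_RELATIONS
VOCAB_SIZE = NUM_ENTITIES + NUM_RELATIONS + 1

def triple_to_tokens(h, r, t, mask_tail=True):
    """Serialize one (h, r, t) triple as [h, r, t] token ids.
    Relations are offset so entities and relations share one id space;
    masking the tail turns link prediction into masked prediction."""
    tokens = [h, NUM_ENTITIES + r, MASK_ID if mask_tail else t]
    return torch.tensor(tokens)

class TinyKGTransformer(nn.Module):
    def __init__(self, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        # Score every entity as a candidate for the masked position.
        self.head = nn.Linear(d_model, NUM_ENTITIES)

    def forward(self, tokens):            # tokens: (batch, seq_len)
        hidden = self.encoder(self.embed(tokens))
        return self.head(hidden[:, -1])   # logits for the masked tail

# Usage: mask the tail of the triple (h=3, r=1, t=7) and score candidates.
model = TinyKGTransformer()
batch = triple_to_tokens(3, 1, 7).unsqueeze(0)   # shape: (1, 3)
logits = model(batch)                            # shape: (1, NUM_ENTITIES)
```

In a pre-training loop one would mask tokens in sampled query structures and minimize cross-entropy against the true entity ids; the fine-tuning stage described in the abstract would then adapt the same masked-prediction interface to specific complex logical queries.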

Tags: arxiv, cs.LG, graph, knowledge graph, pre-training, training, transformers
