Web: http://arxiv.org/abs/2201.11249

Jan. 28, 2022, 2:11 a.m. | Xinhang Li, Yong Zhang, Chunxiao Xing

cs.LG updates on arXiv.org arxiv.org

Entity alignment aims to integrate heterogeneous knowledge from different knowledge graphs. Recent studies employ embedding-based methods: they first learn representations of the knowledge graphs and then perform entity alignment by measuring the similarity between entity embeddings. However, these methods fail to make good use of relation semantics because of the trade-off between the differing objectives of learning knowledge embeddings and enforcing neighborhood consensus. To address this problem, we propose Relational Knowledge Distillation for Entity Alignment (RKDEA), a Graph …
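The embedding-based alignment step the abstract describes can be sketched minimally: embed entities from both knowledge graphs into a shared vector space, then match each entity to its nearest neighbor by embedding similarity. The sketch below assumes cosine similarity and toy 2-D embeddings; the actual representation learning and distillation in RKDEA are not shown.

```python
import numpy as np

def align_entities(emb_a: np.ndarray, emb_b: np.ndarray) -> np.ndarray:
    """Match each entity in KG A to its most similar entity in KG B
    by cosine similarity of their (pre-learned) embeddings."""
    # L2-normalize each embedding so the dot product equals cosine similarity.
    a = emb_a / np.linalg.norm(emb_a, axis=1, keepdims=True)
    b = emb_b / np.linalg.norm(emb_b, axis=1, keepdims=True)
    sim = a @ b.T  # pairwise cosine similarity matrix, shape (|A|, |B|)
    # Greedy nearest-neighbor alignment: index of best match in B per A-entity.
    return sim.argmax(axis=1)

# Toy example: entity 0 in A points the same way as entity 1 in B, and vice versa.
emb_a = np.array([[1.0, 0.0], [0.0, 1.0]])
emb_b = np.array([[0.0, 1.0], [1.0, 0.0]])
print(align_entities(emb_a, emb_b).tolist())  # → [1, 0]
```

Real systems replace the greedy argmax with more careful matching (e.g. mutual nearest neighbors), and the embeddings themselves come from the learned KG representation, which is where the trade-off the abstract mentions arises.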

