Jan. 12, 2022, 2:11 a.m. | Minjae Park

cs.LG updates on arXiv.org arxiv.org

Heterogeneous graph neural networks can effectively represent the information contained in heterogeneous graphs. Recently, self-supervised approaches that learn a graph's representation through contrastive learning have been studied. In the absence of labels, these methods show great potential. However, contrastive learning relies heavily on positive and negative pairs, and generating high-quality pairs from heterogeneous graphs is difficult. In this paper, in line with recent innovations in self-supervised learning known as BYOL, or bootstrapping, we introduce a …
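As context for the bootstrapping idea mentioned above, here is a minimal sketch (not the paper's method) of a BYOL-style objective: an online encoder is trained to predict the representation produced by a slowly updated target encoder, so no negative pairs are required. The encoder below is a plain MLP stand-in for a heterogeneous GNN, and names such as `Encoder`, `byol_loss`, and `ema_update` are illustrative assumptions.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    """Stand-in for a (heterogeneous) graph encoder."""
    def __init__(self, in_dim=64, hid_dim=128, out_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hid_dim), nn.ReLU(), nn.Linear(hid_dim, out_dim)
        )

    def forward(self, x):
        return self.net(x)


def byol_loss(p_online, z_target):
    """Negative cosine similarity between online prediction and target projection."""
    p = F.normalize(p_online, dim=-1)
    z = F.normalize(z_target, dim=-1)
    return 2 - 2 * (p * z).sum(dim=-1).mean()


@torch.no_grad()
def ema_update(target, online, tau=0.99):
    """Exponential-moving-average update of the target network (no gradients)."""
    for t_param, o_param in zip(target.parameters(), online.parameters()):
        t_param.data.mul_(tau).add_(o_param.data, alpha=1 - tau)


# Online branch: encoder + predictor; target branch: frozen EMA copy of the encoder.
online_encoder = Encoder()
predictor = nn.Linear(64, 64)
target_encoder = copy.deepcopy(online_encoder)
for p in target_encoder.parameters():
    p.requires_grad_(False)

optimizer = torch.optim.Adam(
    list(online_encoder.parameters()) + list(predictor.parameters()), lr=1e-3
)

# Two augmented "views" of the same nodes (random features as placeholders).
view_a, view_b = torch.randn(32, 64), torch.randn(32, 64)

p_a = predictor(online_encoder(view_a))
with torch.no_grad():
    z_b = target_encoder(view_b)

loss = byol_loss(p_a, z_b)
loss.backward()
optimizer.step()
ema_update(target_encoder, online_encoder)
```

In practice the two views would come from graph augmentations (e.g., feature or edge dropping), and the stop-gradient on the target branch plus the EMA update is what prevents collapse without negative samples.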

arxiv bootstrapping graph learning network neural network self-supervised learning supervised learning
