Flaky Performances when Pretraining on Relational Databases. (arXiv:2211.05213v1 [cs.LG])
Nov. 11, 2022, 2:11 a.m. | Shengchao Liu, David Vazquez, Jian Tang, Pierre-André Noël
cs.LG updates on arXiv.org
We explore the downstream task performance of graph neural network (GNN) self-supervised learning (SSL) methods trained on subgraphs extracted from relational databases (RDBs). Intuitively, this joint use of SSL and GNNs should make it possible to leverage more of the available data, which could translate to better results. However, we found that naively porting contrastive SSL techniques can cause "negative transfer": linear evaluation on fixed representations from a pretrained model performs worse than on representations from the randomly-initialized model. Based on the …