Towards Generalised Pre-Training of Graph Models
May 15, 2024, 4:43 a.m. | Alex O. Davies, Riku W. Green, Nirav S. Ajmeri, Telmo M. Silva Filho
cs.LG updates on arXiv.org | arxiv.org
Abstract: The principal benefit of unsupervised representation learning is that a pre-trained model can be fine-tuned where data or labels are scarce. Existing approaches for graph representation learning are domain-specific, maintaining consistent node and edge features across the pre-training and target datasets. This has precluded transfer to multiple domains.
In this work we present Topology Only Pre-Training, a graph pre-training method based on node and edge feature exclusion. Separating graph learning into two stages, topology …
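To make the feature-exclusion idea concrete, here is a minimal sketch of what topology-only pre-training could look like in PyTorch Geometric. It is not the authors' code: the abstract is truncated before the method details, so the link-prediction objective, class names, and dimensions below are all illustrative assumptions. The one element taken from the abstract is that node and edge features are excluded, so the encoder can learn only from graph structure.

```python
# A minimal sketch of topology-only pre-training, assuming the setup in the
# abstract: node/edge features are excluded and the encoder sees structure
# alone. The link-prediction objective and all names are illustrative
# assumptions, not the authors' method.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv
from torch_geometric.utils import negative_sampling


class TopologyEncoder(torch.nn.Module):
    def __init__(self, hidden_dim: int = 64):
        super().__init__()
        # Input features are a constant placeholder: with node and edge
        # features excluded, the model can only exploit topology.
        self.conv1 = GCNConv(1, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, hidden_dim)

    def forward(self, edge_index: torch.Tensor, num_nodes: int) -> torch.Tensor:
        x = torch.ones(num_nodes, 1, device=edge_index.device)
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)


def pretrain_step(model, edge_index, num_nodes, optimizer):
    """One self-supervised step: score real edges above sampled non-edges."""
    model.train()
    optimizer.zero_grad()
    z = model(edge_index, num_nodes)
    neg_edge_index = negative_sampling(edge_index, num_nodes=num_nodes)
    pos = (z[edge_index[0]] * z[edge_index[1]]).sum(dim=-1)
    neg = (z[neg_edge_index[0]] * z[neg_edge_index[1]]).sum(dim=-1)
    loss = F.binary_cross_entropy_with_logits(
        torch.cat([pos, neg]),
        torch.cat([torch.ones_like(pos), torch.zeros_like(neg)]),
    )
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the pre-trained weights never depend on dataset-specific features, the same encoder could in principle be fine-tuned on graphs from any domain, which is the cross-domain transfer the abstract highlights.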
More from arxiv.org / cs.LG updates on arXiv.org
Trainwreck: A damaging adversarial attack on image classifiers
1 day, 17 hours ago | arxiv.org
Fast Controllable Diffusion Models for Undersampled MRI Reconstruction
1 day, 17 hours ago | arxiv.org
Jobs in AI, ML, Big Data
Senior Machine Learning Engineer
@ GPTZero | Toronto, Canada
Sr. Data Operations
@ Carousell Group | West Jakarta, Indonesia
Senior Analyst, Business Intelligence & Reporting
@ Deutsche Bank | Bucharest
Business Intelligence Subject Matter Expert (SME) - Assistant Vice President
@ Deutsche Bank | Cary, 3000 CentreGreen Way
Enterprise Business Intelligence Specialist
@ NAIC | Kansas City
Senior Business Intelligence (BI) Developer - Associate
@ Deutsche Bank | Cary, 3000 CentreGreen Way