GSplit: Scaling Graph Neural Network Training on Large Graphs via Split-Parallelism
June 28, 2024, 4:45 a.m. | Sandeep Polisetty, Juelin Liu, Kobi Falus, Yi Ren Fung, Seung-Hwan Lim, Hui Guan, Marco Serafini
cs.LG updates on arXiv.org arxiv.org
Abstract: Graph neural networks (GNNs), an emerging class of machine learning models for graphs, have gained popularity for their superior performance in various graph analytical tasks. Mini-batch training is commonly used to train GNNs on large graphs, and data parallelism is the standard approach to scale mini-batch training across multiple GPUs. One of the major performance costs in GNN training is the loading of input features, which prevents GPUs from being fully utilized. In this paper, …
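To make the bottleneck concrete, here is a minimal, self-contained sketch of mini-batch GNN training with neighbor sampling, in the style the abstract describes. The toy graph, feature table, and function names are illustrative assumptions, not the paper's actual system: each mini-batch samples a fan-out of neighbors around its seed nodes and then gathers the input features of every touched node — the loading step the authors identify as a major cost, since under data parallelism each GPU repeats this gather for its own batch.

```python
import random

# Hypothetical toy graph: node -> neighbor list (illustrative only).
graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
# One feature vector per node; in real training these sit in host
# memory, and loading them per mini-batch is the dominant cost.
features = {n: [float(n), float(n) + 1.0] for n in graph}

def sample_minibatch(seeds, fanout):
    """Sample up to `fanout` neighbors per seed, then gather input
    features for every node touched (the expensive loading step)."""
    touched = set(seeds)
    sampled = {}
    for s in seeds:
        nbrs = random.sample(graph[s], min(fanout, len(graph[s])))
        sampled[s] = nbrs
        touched.update(nbrs)
    # Feature loading: with data parallelism, each GPU performs this
    # gather independently for its own mini-batch.
    batch_feats = {n: features[n] for n in touched}
    return sampled, batch_feats

def mean_aggregate(sampled, batch_feats):
    """One GNN layer sketched as mean aggregation over sampled neighbors."""
    out = {}
    for s, nbrs in sampled.items():
        vecs = [batch_feats[n] for n in nbrs] or [batch_feats[s]]
        out[s] = [sum(col) / len(vecs) for col in zip(*vecs)]
    return out

random.seed(0)
sampled, feats = sample_minibatch([0, 3], fanout=2)
out = mean_aggregate(sampled, feats)
print(out)  # mean of each seed's sampled neighbor features
```

Because the gather touches the full sampled neighborhood rather than just the seeds, feature-loading volume grows with fan-out and layer depth, which is why it can starve otherwise idle GPUs — the problem the paper's split-parallelism targets.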
More from arxiv.org / cs.LG updates on arXiv.org
MixerFlow: MLP-Mixer meets Normalising Flows
2 days, 10 hours ago | arxiv.org
Machine Learning-Enabled Software and System Architecture Frameworks
2 days, 10 hours ago | arxiv.org
Kernelised Normalising Flows
2 days, 10 hours ago | arxiv.org
Jobs in AI, ML, Big Data
Data Scientist
@ Ford Motor Company | Chennai, Tamil Nadu, India
Systems Software Engineer, Graphics
@ Parallelz | Vancouver, British Columbia, Canada - Remote
Engineering Manager - Geo Engineering Team (F/H/X)
@ AVIV Group | Paris, France
Data Analyst
@ Microsoft | San Antonio, Texas, United States
Azure Data Engineer
@ TechVedika | Hyderabad, India
Senior Data & AI Threat Detection Researcher (Cortex)
@ Palo Alto Networks | Tel Aviv-Yafo, Israel