Feb. 14, 2024, 5:43 a.m. | Ahmad Naser Eddin, Jacopo Bono, David Aparício, Hugo Ferreira, João Ascensão, Pedro Ribeiro, Pedro Biz

cs.LG updates on arXiv.org

Many real-world datasets have an underlying dynamic graph structure, where entities and their interactions evolve over time. Machine learning models should consider these dynamics in order to harness their full potential in downstream tasks. Previous approaches for graph representation learning have focused on either sampling k-hop neighborhoods, akin to breadth-first search, or random walks, akin to depth-first search. However, these methods are computationally expensive and unsuitable for real-time, low-latency inference on dynamic graphs. To overcome these limitations, we propose graph-sprints …
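As an illustrative sketch only (not the paper's graph-sprints method, whose details are truncated above), the two baseline neighborhood-expansion strategies the abstract contrasts can be pictured as follows: breadth-first k-hop sampling versus a depth-first-style random walk. The function names, the toy adjacency list, and all parameters here are hypothetical, chosen purely for intuition about why re-running such expansions per node is costly for real-time inference on a dynamic graph.

```python
# Hypothetical sketch: BFS-like k-hop sampling vs. DFS-like random walk.
# Not the graph-sprints algorithm; names and the toy graph are illustrative.
import random
from collections import deque

def k_hop_neighborhood(adj, start, k):
    """Collect all nodes within k hops of `start` (breadth-first expansion)."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue
        for nbr in adj.get(node, []):
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, depth + 1))
    return seen

def random_walk(adj, start, length, rng=random):
    """Sample one random walk of `length` steps (depth-first-style exploration)."""
    walk = [start]
    node = start
    for _ in range(length):
        nbrs = adj.get(node, [])
        if not nbrs:
            break
        node = rng.choice(nbrs)
        walk.append(node)
    return walk

if __name__ == "__main__":
    # Toy static snapshot; on a dynamic graph these expansions would have to be
    # recomputed as edges arrive, which is what makes them expensive online.
    adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
    print(k_hop_neighborhood(adj, start=0, k=2))
    print(random_walk(adj, start=0, length=4))
```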
