Feb. 12, 2024, 5:43 a.m. | Xi Chen, Siwei Zhang, Yun Xiong, Xixi Wu, Jiawei Zhang, Xiangguo Sun, Yao Zhang, Yinglong Zhao

cs.LG updates on arXiv.org

Temporal Interaction Graphs (TIGs) are widely used to represent real-world systems. To facilitate representation learning on TIGs, researchers have proposed a series of TIG models. However, these models still face two significant gaps between pre-training and downstream prediction in their "pre-train, predict" training paradigm. First, the temporal discrepancy between the pre-training and inference data severely undermines the models' applicability to distant-future predictions on dynamically evolving data. Second, the semantic divergence between pretext and downstream tasks hinders …

cs.AI cs.LG cs.SI
