Feb. 9, 2024, 5:42 a.m. | Gangda Deng, Hongkuan Zhou, Hanqing Zeng, Yinglong Xia, Christopher Leung, Jianbo Li, Rajgopal Kannan

cs.LG updates on arXiv.org

Recently, Temporal Graph Neural Networks (TGNNs) have demonstrated state-of-the-art performance in various high-impact applications, including fraud detection and content recommendation. Despite their success, TGNNs are prone to the noise prevalent in real-world dynamic graphs, such as time-deprecated links and skewed interaction distributions. This noise causes two critical issues that significantly compromise the accuracy of TGNNs: (1) models are supervised by inferior interactions, and (2) noisy inputs induce high variance in the aggregated messages. However, current TGNN denoising techniques …
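The second issue, high variance in aggregated messages, can be sketched with a toy example. The snippet below is illustrative only (it is not the paper's method or model): it assumes simple mean aggregation over one-dimensional neighbor messages and shows how a few noisy neighbors, e.g. from stale links, inflate the variance of the aggregated result.

```python
import numpy as np

rng = np.random.default_rng(0)

def aggregate(messages):
    """Mean aggregation over neighbor messages (one dimension for brevity)."""
    return messages.mean()

# Hypothetical setup: "clean" neighbors send low-variance informative
# messages; noisy neighbors (e.g. time-deprecated links) send messages
# drawn from a much wider distribution.
n_trials = 2000
clean_only, with_noise = [], []
for _ in range(n_trials):
    clean = rng.normal(loc=1.0, scale=0.1, size=8)  # 8 informative neighbors
    noisy = rng.normal(loc=0.0, scale=3.0, size=2)  # 2 noisy/stale neighbors
    clean_only.append(aggregate(clean))
    with_noise.append(aggregate(np.concatenate([clean, noisy])))

# The aggregated message varies far more once noisy neighbors are included,
# which is the variance problem an adaptive neighbor sampler aims to reduce.
print("variance, clean neighbors only:", np.var(clean_only))
print("variance, with noisy neighbors:", np.var(with_noise))
```

Even with only two noisy neighbors out of ten, the variance of the aggregated message grows by orders of magnitude, which motivates filtering or down-weighting such neighbors during sampling.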

