May 15, 2023, 12:44 a.m. | Dominic Masters, Josef Dean, Kerstin Klaser, Zhiyi Li, Sam Maddrell-Mander, Adam Sanders, Hatem Helal, Deniz Beker, Andrew Fitzgibbon, Shenyang Huang,

cs.LG updates on arXiv.org

We present GPS++, a hybrid Message Passing Neural Network / Graph Transformer
model for molecular property prediction. Our model integrates a well-tuned
local message passing component and biased global attention with other key
ideas from the prior literature to achieve state-of-the-art results on the
large-scale molecular dataset PCQM4Mv2. Through a thorough ablation study, we
highlight the impact of individual components and find that nearly all of the
model's performance can be maintained without any use of global
self-attention, showing that message passing …
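To make the hybrid design concrete, here is a minimal NumPy sketch of a GPS-style layer that sums a local message-passing branch and a biased global self-attention branch. All function and parameter names (`mpnn_layer`, `global_attention`, `gps_layer`, the `params` dict) are illustrative assumptions, not the authors' actual GPS++ implementation, which includes many further components (gating, normalization, edge features).

```python
import numpy as np

def mpnn_layer(h, adj, W):
    """Local branch: sum messages from adjacent nodes, then a ReLU MLP step."""
    msgs = adj @ h                            # aggregate neighbor features
    return np.maximum((h + msgs) @ W, 0.0)    # combine with self features

def global_attention(h, Wq, Wk, Wv, bias=None):
    """Global branch: dense self-attention over all node pairs,
    optionally biased by a structural term (e.g. derived from the graph)."""
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    scores = q @ k.T / np.sqrt(k.shape[1])
    if bias is not None:
        scores = scores + bias                # attention bias, as in GPS-style models
    a = np.exp(scores - scores.max(axis=1, keepdims=True))
    a = a / a.sum(axis=1, keepdims=True)      # row-wise softmax
    return a @ v

def gps_layer(h, adj, params, bias=None):
    """Hybrid block: local message passing plus biased global attention."""
    local = mpnn_layer(h, adj, params["W"])
    glob = global_attention(h, params["Wq"], params["Wk"], params["Wv"], bias)
    return local + glob
```

The ablation finding in the abstract corresponds to dropping the `global_attention` branch here and relying on the local branch alone.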

