Feb. 21, 2024, 5:42 a.m. | Rui Jiao, Xiangzhe Kong, Ziyang Yu, Wenbing Huang, Yang Liu

cs.LG updates on arXiv.org

arXiv:2402.12714v1 Announce Type: new
Abstract: Pretraining on large numbers of unlabeled 3D molecules has shown strong performance across a range of scientific applications. However, prior efforts typically pretrain models on a single domain, either proteins or small molecules, and thus miss the opportunity to leverage cross-domain knowledge. To bridge this gap, we introduce the Equivariant Pretrained Transformer (EPT), a novel pretraining framework designed to harmonize the geometric learning of small molecules and proteins. To be specific, EPT unifies the geometric modeling of …
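The abstract is truncated in this feed, but the equivariance property named in the title is the core idea: a layer's geometric outputs should rotate and translate together with its 3D inputs. Below is a minimal, hypothetical sketch in plain NumPy (not the authors' EPT architecture) of a toy E(3)-equivariant coordinate update, of the kind standard in geometric molecular models, together with a check that transforming the input transforms the output identically.

```python
import numpy as np

def toy_equivariant_layer(coords, feats):
    """Toy E(3)-equivariant update (illustrative only, not EPT).

    coords: (N, 3) atom positions; feats: (N, d) invariant features.
    Coordinates are updated by a weighted sum of relative vectors, with
    weights built only from invariants (distances, feature similarities),
    which is what makes the layer equivariant.
    """
    diff = coords[:, None, :] - coords[None, :, :]        # (N, N, 3) relative vectors
    dist = np.linalg.norm(diff, axis=-1, keepdims=True)   # (N, N, 1) invariant distances
    w = np.exp(-dist) * (feats @ feats.T)[..., None]      # (N, N, 1) invariant weights
    np.fill_diagonal(w[..., 0], 0.0)                      # no self-interaction
    return coords + (w * diff).sum(axis=1) / coords.shape[0]

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))   # toy 3D structure
h = rng.normal(size=(5, 8))   # toy node features

Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal transform
t = rng.normal(size=(1, 3))                   # random translation

# Rotation equivariance: rotating the input rotates the output.
assert np.allclose(toy_equivariant_layer(x @ Q.T, h),
                   toy_equivariant_layer(x, h) @ Q.T, atol=1e-8)

# Translation equivariance: shifting the input shifts the output.
assert np.allclose(toy_equivariant_layer(x + t, h),
                   toy_equivariant_layer(x, h) + t, atol=1e-8)
```

Because the weights depend only on rotation- and translation-invariant quantities while the update direction is a relative vector, the same construction applies uniformly to small-molecule conformers and protein structures, which is the kind of unified geometric modeling the abstract points toward.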
