March 11, 2024, 4:42 a.m. | Pengcheng Xu, Tao Feng, Tianfan Fu, Siddhartha Laghuvarapu, Jimeng Sun

cs.LG updates on arXiv.org

arXiv:2310.05365v5 Announce Type: replace
Abstract: In this work, we introduce a method to fine-tune a Transformer-based generative model for de novo molecular design. Leveraging the superior sequence-learning capacity of Transformers over Recurrent Neural Networks (RNNs), our model can effectively generate molecular structures with desired properties. In contrast to traditional RNN-based models, the proposed method performs better at generating compounds predicted to be active against various biological targets, as it captures long-term dependencies in the molecular structure sequence. The model's …
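
As a rough illustration of the approach described above, here is a minimal sketch (not the authors' code) of REINFORCE-style fine-tuning of a small decoder-only Transformer that samples SMILES tokens. The toy vocabulary, the reward() stub standing in for an activity or property predictor, and all hyperparameters are illustrative assumptions rather than details from the paper.

```python
# Minimal sketch: policy-gradient (REINFORCE) fine-tuning of a Transformer
# SMILES generator. Vocabulary, reward(), and hyperparameters are hypothetical.
import torch
import torch.nn as nn

VOCAB = ["<pad>", "<bos>", "<eos>", "C", "N", "O", "(", ")", "=", "1", "2"]  # toy SMILES vocab
stoi = {t: i for i, t in enumerate(VOCAB)}

class SmilesTransformer(nn.Module):
    """Decoder-only Transformer that models SMILES token sequences."""
    def __init__(self, vocab_size, d_model=128, nhead=4, nlayers=2, max_len=64):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, 4 * d_model, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, nlayers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, idx):
        # idx: (batch, seq_len) token ids; causal mask keeps attention autoregressive
        T = idx.size(1)
        x = self.tok(idx) + self.pos(torch.arange(T, device=idx.device))
        mask = torch.triu(torch.full((T, T), float("-inf"), device=idx.device), diagonal=1)
        return self.head(self.blocks(x, mask=mask))  # (batch, seq_len, vocab)

def sample_with_logprobs(model, batch=8, max_len=32):
    """Autoregressively sample token ids, keeping per-step log-probs for REINFORCE."""
    idx = torch.full((batch, 1), stoi["<bos>"], dtype=torch.long)
    logps = []
    for _ in range(max_len):
        logits = model(idx)[:, -1, :]
        dist = torch.distributions.Categorical(logits=logits)
        nxt = dist.sample()
        logps.append(dist.log_prob(nxt))
        idx = torch.cat([idx, nxt.unsqueeze(1)], dim=1)
    return idx, torch.stack(logps, dim=1)  # (batch, max_len)

def reward(smiles_ids):
    """Hypothetical stand-in for a predicted-activity/property score per molecule."""
    return torch.rand(smiles_ids.size(0))

model = SmilesTransformer(len(VOCAB))  # would first be pretrained on a SMILES corpus
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
for step in range(3):  # REINFORCE fine-tuning loop
    ids, logps = sample_with_logprobs(model)
    R = reward(ids)
    loss = -(R.unsqueeze(1) * logps).mean()  # ascend expected reward
    opt.zero_grad(); loss.backward(); opt.step()
    print(f"step {step}: mean reward {R.mean():.3f}")
```

In a real pipeline, the generator would be pretrained on a large SMILES corpus before fine-tuning, and reward() would be a docking score or a trained bioactivity predictor for the target of interest.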
