April 27, 2022, 1:11 a.m. | Lai Wei, Qinyang Li, Yuqi Song, Stanislav Stefanov, Edirisuriya M. D. Siriwardane, Fanglin Chen, Jianjun Hu

cs.LG updates on arXiv.org

Self-supervised neural language models have recently achieved unprecedented
success, from natural language processing to learning the languages of
biological sequences and organic molecules. These models have demonstrated
superior performance in generation, structure classification, and
function prediction for proteins and molecules with learned representations.
However, most of the masking-based pre-trained language models are not designed
for generative design, and their black-box nature makes it difficult to
interpret their design logic. Here we propose BLMM Crystal Transformer, a
neural network based …
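
The abstract describes repurposing masking-based (blank-filling) pre-training for generative materials design. As a rough illustration of that general idea only, here is a minimal masked-token sketch over element-symbol sequences in PyTorch; the vocabulary, the toy composition, and the model sizes are assumptions made for illustration and are not the paper's actual BLMM Crystal Transformer.

# Illustrative sketch only (not the paper's BLMM implementation): a toy
# masked-token objective over element-symbol sequences, using PyTorch.
# The vocabulary, composition, and model sizes below are hypothetical.
import torch
import torch.nn as nn

vocab = ["[PAD]", "[MASK]", "Li", "O", "Fe", "P", "Si", "Na", "Cl"]
tok2id = {t: i for i, t in enumerate(vocab)}

class TinyMaskedLM(nn.Module):
    def __init__(self, vocab_size, d_model=32, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, vocab_size)  # scores over the vocabulary

    def forward(self, ids):
        return self.head(self.encoder(self.embed(ids)))

# One toy "composition" with a blanked slot: Li [MASK] P O, target token Fe.
inp = torch.tensor([[tok2id["Li"], tok2id["[MASK]"], tok2id["P"], tok2id["O"]]])
tgt = torch.tensor([[tok2id["Li"], tok2id["Fe"], tok2id["P"], tok2id["O"]]])

model = TinyMaskedLM(len(vocab))
logits = model(inp)                      # shape: (batch, seq_len, vocab)
masked = inp.eq(tok2id["[MASK]"])        # compute loss only at blanked positions
loss = nn.functional.cross_entropy(logits[masked], tgt[masked])
loss.backward()                          # standard masked-token training step

In a blank-filling setup of this kind, generation can proceed by repeatedly predicting and filling masked slots rather than decoding strictly left to right, which is what makes masked objectives usable for generative design.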

arxiv design language language model learning materials self-learning transformer
