Crystal Transformer: Self-learning neural language model for Generative and Tinkering Design of Materials. (arXiv:2204.11953v1 [cond-mat.mtrl-sci])
cs.LG updates on arXiv.org
Self-supervised neural language models have recently achieved unprecedented success, from natural language processing to learning the languages of biological sequences and organic molecules. With their learned representations, these models have demonstrated superior performance in generation, structure classification, and function prediction for proteins and molecules. However, most masking-based pre-trained language models are not designed for generative design, and their black-box nature makes it difficult to interpret their design logic. Here we propose BLMM Crystal Transformer, a neural network based …
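To make the contrast concrete, masking-based pre-training of the kind the abstract criticizes works by hiding a fraction of input tokens and training the model to recover them. The sketch below illustrates only that masking step on a toy, hypothetical element-wise tokenization of a chemical formula; it is not the paper's BLMM architecture, and the token scheme, mask rate, and `[MASK]` symbol are illustrative assumptions.

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """BERT-style masking: hide a fraction of tokens so a model can be
    trained to recover them. Returns (masked, labels), where labels holds
    the original token at masked positions and None elsewhere.
    (Illustrative sketch only; not the BLMM method from the paper.)"""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            masked.append(mask_token)   # hide this token from the model
            labels.append(tok)          # training target: the hidden token
        else:
            masked.append(tok)
            labels.append(None)         # no loss computed here
    return masked, labels

# Hypothetical element-wise tokenization of SrTiO3.
tokens = ["Sr", "Ti", "O", "O", "O"]
masked, labels = mask_tokens(tokens, mask_rate=0.4, seed=1)
```

Because the masking objective only reconstructs hidden positions in a fixed-length input, sampling new sequences from such a model requires extra machinery, which is one reason masked pre-training alone is awkward for generative design.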