Web: http://arxiv.org/abs/2201.09948

Jan. 26, 2022, 2:10 a.m. | Egbert Castro, Abhinav Godavarthi, Julian Rubinfien, Kevin B. Givechian, Dhananjay Bhaskar, Smita Krishnaswamy

cs.LG updates on arXiv.org

The development of powerful natural language models has increased the
ability to learn meaningful representations of protein sequences. In addition,
advances in high-throughput mutagenesis, directed evolution, and
next-generation sequencing have allowed for the accumulation of large amounts
of labeled fitness data. Leveraging these two trends, we introduce Regularized
Latent Space Optimization (ReLSO), a deep transformer-based autoencoder that
is trained to jointly generate sequences and predict fitness. Using ReLSO, we
explicitly model the underlying sequence-function landscape of large
labeled …
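The abstract is truncated, so the paper's architecture details are not given here. As a rough illustration of the jointly-trained setup it describes, below is a minimal PyTorch sketch of a transformer autoencoder whose latent code feeds both a sequence decoder and a fitness-regression head. All layer sizes, the mean-pooling bottleneck, the simple per-position decoder, and the equal loss weighting are assumptions made for this sketch, not the actual ReLSO design.

```python
import torch
import torch.nn as nn

class JointAutoencoder(nn.Module):
    """Toy transformer autoencoder with a fitness head on the latent code.

    Illustrative only: the sizes, pooling, and heads below are assumptions,
    not the ReLSO architecture from the paper.
    """

    def __init__(self, vocab_size=21, d_model=64, latent_dim=16, max_len=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=4,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        # Bottleneck: mean-pool token states into a single latent vector z.
        self.to_latent = nn.Linear(d_model, latent_dim)
        # Decoder head: project z back to per-position token logits.
        self.from_latent = nn.Linear(latent_dim, d_model)
        self.decode = nn.Linear(d_model, vocab_size)
        # Fitness head: small MLP regressor on the latent code.
        self.fitness = nn.Sequential(
            nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, tokens):
        h = self.encoder(self.embed(tokens) + self.pos[:, :tokens.size(1)])
        z = self.to_latent(h.mean(dim=1))             # (batch, latent_dim)
        h_dec = self.from_latent(z).unsqueeze(1).expand(-1, tokens.size(1), -1)
        logits = self.decode(h_dec)                   # (batch, len, vocab)
        fit = self.fitness(z).squeeze(-1)             # (batch,)
        return logits, fit, z

model = JointAutoencoder()
tokens = torch.randint(0, 21, (8, 50))                # dummy sequence batch
fitness_labels = torch.randn(8)                       # dummy fitness values
logits, fit_pred, z = model(tokens)
# Joint objective: reconstruct the input sequence and predict its fitness.
loss = (nn.functional.cross_entropy(logits.reshape(-1, 21), tokens.reshape(-1))
        + nn.functional.mse_loss(fit_pred, fitness_labels))
loss.backward()
```

Because the fitness head is differentiable in z, candidate sequences can in principle be proposed by gradient ascent on predicted fitness in latent space and then decoded, which is the "latent space optimization" idea the method's name refers to.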

arxiv design transformers
