June 15, 2022, 1:12 a.m. | Eric Malmi, Yue Dong, Jonathan Mallinson, Aleksandr Chuklin, Jakub Adamek, Daniil Mirylenka, Felix Stahlberg, Sebastian Krause, Shankar Kumar, Aliaksei Severyn

cs.CL updates on arXiv.org

Text-editing models have recently become a prominent alternative to seq2seq models for monolingual text-generation tasks such as grammatical error correction, text simplification, and style transfer. These tasks share a common trait: they exhibit a large amount of textual overlap between the source and target texts. Text-editing models exploit this overlap and learn to generate the output by predicting edit operations applied to the source sequence. In contrast, seq2seq models generate outputs word by word from scratch, which makes them slow …
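To make the edit-operation idea concrete, here is a minimal Python sketch. The tag scheme (KEEP, DELETE, REPLACE:<word>, with an optional "+<word>" insertion suffix) and the apply_edits helper are illustrative assumptions in the spirit of tagging-based editors, not the exact formulation of any particular model:

    def apply_edits(source_tokens, edit_tags):
        """Apply one edit tag per source token to produce the output.

        Tags (illustrative): "KEEP", "DELETE", "REPLACE:<word>".
        A "+<word>" suffix inserts <word> after the edited position.
        """
        output = []
        for token, tag in zip(source_tokens, edit_tags):
            base, _, added = tag.partition("+")
            if base == "KEEP":
                output.append(token)
            elif base.startswith("REPLACE:"):
                output.append(base[len("REPLACE:"):])
            # "DELETE" contributes nothing to the output.
            if added:
                output.append(added)
        return output

    # Grammatical error correction: "He go to school" -> "He goes to school".
    source = ["He", "go", "to", "school"]
    tags = ["KEEP", "REPLACE:goes", "KEEP", "KEEP"]
    print(" ".join(apply_edits(source, tags)))  # He goes to school

Because most tags are simply KEEP when source and target overlap heavily, the model predicts far less than a full output sequence, which is the intuition behind the speed advantage over word-by-word seq2seq decoding noted in the abstract.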

arxiv, generation, text, text generation
