Aug. 29, 2022, 1:13 a.m. | Ayana Niwa, Sho Takase, Naoaki Okazaki

cs.CL updates on arXiv.org arxiv.org

Non-autoregressive (NAR) models can generate sentences with less computation
than autoregressive models but sacrifice generation quality. Previous studies
addressed this issue through iterative decoding. This study proposes using
nearest neighbors as the initial state of an NAR decoder and editing them
iteratively. We present a novel training strategy to learn the edit operations
on neighbors to improve NAR text generation. Experimental results show that the
proposed method (NeighborEdit) achieves higher translation quality (1.69 points
higher than the vanilla Transformer) with …
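The retrieve-then-edit idea in the abstract can be sketched as a toy decoding loop: fetch the nearest-neighbor target from a translation memory, then repeatedly repredict all positions in parallel until the output stabilizes. Everything below (the overlap-based retrieval, the `predict` callback, the toy memory) is an illustrative assumption, not the paper's actual architecture or training strategy.

```python
# Hypothetical sketch of retrieve-then-edit NAR decoding (the NeighborEdit idea):
# 1) retrieve the nearest-neighbor target from stored training pairs,
# 2) iteratively edit it with parallel (non-autoregressive) predictions.
# The retrieval metric and the toy "edit model" are illustrative assumptions.

def retrieve_neighbor(source, memory):
    """Return the target side of the stored pair whose source overlaps most."""
    def overlap(a, b):
        return len(set(a.split()) & set(b.split()))
    best_src = max(memory, key=lambda s: overlap(s, source))
    return memory[best_src]

def edit_step(tokens, predict):
    """One non-autoregressive pass: repredict every position in parallel."""
    return [predict(i, tok) for i, tok in enumerate(tokens)]

def neighbor_edit_decode(source, memory, predict, max_iters=3):
    """Initialize the decoder with a retrieved neighbor, then edit iteratively."""
    tokens = retrieve_neighbor(source, memory).split()
    for _ in range(max_iters):
        new_tokens = edit_step(tokens, predict)
        if new_tokens == tokens:  # converged: no position changed
            break
        tokens = new_tokens
    return " ".join(tokens)

# Toy demo: the stand-in "model" rewrites one token of the retrieved neighbor.
memory = {"ich sehe eine katze": "i see a cat"}
predict = lambda i, tok: {"cat": "dog"}.get(tok, tok)
print(neighbor_edit_decode("ich sehe einen hund", memory, predict))
```

The point of starting from a neighbor rather than a blank sequence is that the decoder only needs to learn local edits, which is the intuition behind the training strategy the abstract describes.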

