April 16, 2024, 4:42 a.m. | Arnaud Pannatier, Evann Courdier, François Fleuret

cs.LG updates on arXiv.org

arXiv:2404.09562v1 Announce Type: new
Abstract: Autoregressive models, such as the GPT family, use a fixed order, usually left-to-right, to generate sequences. However, this is not a necessity. In this paper, we challenge this assumption and show that by simply adding a positional encoding for the output, this order can be modulated on the fly, per sample, which offers key advantageous properties. It allows for sampling of, and conditioning on, arbitrary subsets of tokens, and it also allows sampling in one shot multiple …
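A minimal PyTorch sketch (not the authors' implementation) of the idea the abstract describes: alongside the usual encoding of each token's own position, every input also carries the positional encoding of the position to be predicted next, so the decoding order can be an arbitrary per-sample permutation. The module name ShuffledGPT, the hyperparameters, and the absence of a start token are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class ShuffledGPT(nn.Module):
    """Causal transformer with a double positional encoding (sketch).

    Each input embedding sums: token embedding + encoding of the token's
    own (original) position + encoding of the position to predict next.
    Names and sizes are assumptions, not the paper's code.
    """

    def __init__(self, vocab_size=256, max_len=128, d_model=128,
                 n_heads=4, n_layers=2):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.in_pos = nn.Embedding(max_len, d_model)   # position of current token
        self.out_pos = nn.Embedding(max_len, d_model)  # position of token to predict
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens, cur_positions, next_positions):
        # tokens:         (B, T) token ids, already shuffled by a permutation
        # cur_positions:  (B, T) original position of each shuffled token
        # next_positions: (B, T) original position of the token to predict
        x = (self.tok_emb(tokens)
             + self.in_pos(cur_positions)
             + self.out_pos(next_positions))
        T = tokens.size(1)
        # Standard causal mask over the *shuffled* order.
        causal = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        h = self.backbone(x, mask=causal)
        return self.head(h)  # (B, T, vocab) logits for the next positions


# Toy usage: each sample is presented in a fresh random order.
B, T = 2, 8
seq = torch.randint(0, 256, (B, T))
sigma = torch.stack([torch.randperm(T) for _ in range(B)])  # per-sample order
shuffled = torch.gather(seq, 1, sigma)
cur_pos = sigma                                  # where each token came from
next_pos = torch.roll(sigma, shifts=-1, dims=1)  # position predicted at each step
targets = torch.roll(shuffled, shifts=-1, dims=1)

model = ShuffledGPT()
logits = model(shuffled, cur_pos, next_pos)
# Drop the last step, whose rolled target wraps around.
loss = nn.functional.cross_entropy(
    logits[:, :-1].reshape(-1, 256), targets[:, :-1].reshape(-1))
loss.backward()
```

Because the order is an input rather than a fixed convention, conditioning on an arbitrary subset of tokens amounts to placing those tokens first in the permutation; the same trained model then samples the remaining positions in any order.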

