Sept. 6, 2023, 1:10 p.m. | /u/LilHairdy

Machine Learning www.reddit.com

Hello everyone,

I've been pondering sinusoidal positional encoding and its limitations. Does anybody know whether there is a maximum sequence length that this absolute positional encoding can support? I'm coming from a deep reinforcement learning background, so I'm not very familiar with the NLP literature; for example, I couldn't figure out the maximum sequence length used in the original Transformer paper.
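For reference, here is a minimal sketch of the encoding I'm asking about, following the standard formulation from "Attention Is All You Need" (Vaswani et al., 2017). The `max_len=2048` and `d_model=512` values at the bottom are just illustrative choices, not anything from a specific paper; the formula itself is defined for arbitrary positions, so any cap comes from the table length you precompute rather than from the math.

```python
import numpy as np

def sinusoidal_positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Standard sinusoidal encoding:
    PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
    PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(max_len)[:, None]                 # (max_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]                # (1, d_model/2)
    angle_rates = 1.0 / np.power(10000.0, dims / d_model)   # geometric frequency schedule
    angles = positions * angle_rates                        # (max_len, d_model/2)

    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)   # even dimensions get sine
    pe[:, 1::2] = np.cos(angles)   # odd dimensions get cosine
    return pe

# Illustrative usage: precompute a table up to whatever length you need.
pe_table = sinusoidal_positional_encoding(max_len=2048, d_model=512)
print(pe_table.shape)  # (2048, 512)
```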

Thanks in advance for any info!
