Jan. 15, 2024, 5:49 p.m. | /u/karun_kodes

Machine Learning www.reddit.com

So I was just reading about absolute positional encoding and then about relative positional embedding.

All I could understand is how it's computed relative to each word, but I really couldn't see the advantages over the absolute one, since "Attention Is All You Need" also states: "We chose this function because we hypothesized it would allow the model to easily learn to attend by relative positions." So what advantage does relative positional encoding carry?
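
To check my understanding of the absolute case, here's a rough NumPy sketch (just my own illustration, the function name and the sizes are placeholders) of the sinusoidal encoding, plus a check of the property the quoted sentence seems to rely on: PE(pos + k) is a fixed rotation of PE(pos), so relative offsets are linearly recoverable.

```python
# Minimal sketch of the sinusoidal absolute positional encoding from
# "Attention Is All You Need" and the linearity behind the quoted hypothesis.
import numpy as np

def sinusoidal_encoding(num_positions, d_model):
    """PE[pos, 2i] = sin(pos / 10000^(2i/d)), PE[pos, 2i+1] = cos(pos / 10000^(2i/d))."""
    positions = np.arange(num_positions)[:, None]                          # (pos, 1)
    freqs = np.exp(-np.log(10000.0) * np.arange(0, d_model, 2) / d_model)  # (d/2,)
    pe = np.zeros((num_positions, d_model))
    pe[:, 0::2] = np.sin(positions * freqs)
    pe[:, 1::2] = np.cos(positions * freqs)
    return pe

d_model, offset = 16, 3          # arbitrary toy values
pe = sinusoidal_encoding(64, d_model)

# One 2x2 rotation per frequency maps PE(pos) -> PE(pos + offset),
# i.e. a fixed linear transform encodes a relative shift of `offset`.
freqs = np.exp(-np.log(10000.0) * np.arange(0, d_model, 2) / d_model)
shifted = np.zeros_like(pe)
for i, w in enumerate(freqs):
    c, s = np.cos(offset * w), np.sin(offset * w)
    sin_col, cos_col = pe[:, 2 * i], pe[:, 2 * i + 1]
    shifted[:, 2 * i]     = c * sin_col + s * cos_col    # sin((pos + offset) * w)
    shifted[:, 2 * i + 1] = -s * sin_col + c * cos_col   # cos((pos + offset) * w)

print(np.allclose(shifted[:-offset], pe[offset:]))  # True
```

If that linearity already lets attention pick up relative offsets, I don't see what explicitly relative embeddings buy on top of it.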


And can …

