Oct. 8, 2023, 3:43 p.m. | /u/jhanjeek

Deep Learning www.reddit.com

Hi All,

I have been a bit out of touch with attention mechanisms. I know the core multi-head attention from the "Attention Is All You Need" paper, but I think there have been some new developments in the field since then. Can someone help me with a list of the newer attention mechanisms that I can start reading up on? I know of FlashAttention, but I think there are even newer methods now. I tried googling but with not much …
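For context, the baseline I mean is the scaled dot-product attention from that paper. Here is a minimal sketch of it, assuming PyTorch 2.x, which also exposes a fused kernel (`torch.nn.functional.scaled_dot_product_attention`) that can dispatch to a FlashAttention implementation on supported GPUs:

```python
import torch
import torch.nn.functional as F

# Toy shapes: (batch, heads, sequence length, head dimension)
B, H, S, D = 2, 8, 128, 64
q = torch.randn(B, H, S, D)
k = torch.randn(B, H, S, D)
v = torch.randn(B, H, S, D)

# Baseline from "Attention Is All You Need":
# softmax(QK^T / sqrt(d)) V, materializing the full S x S score matrix.
scores = q @ k.transpose(-2, -1) / D**0.5
baseline = scores.softmax(dim=-1) @ v

# Fused variant: same math, but the kernel can avoid materializing
# the S x S matrix (FlashAttention-style tiling on supported hardware).
fused = F.scaled_dot_product_attention(q, k, v)

print(torch.allclose(baseline, fused, atol=1e-5))  # True
```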
