April 11, 2024, 5:35 p.m. | /u/Dyoakom

Machine Learning www.reddit.com

I took a look and didn't see a discussion thread here on this paper, which looks promising.


[Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention](https://arxiv.org/abs/2404.07143)


What are your thoughts? Could it be one of the techniques behind Gemini 1.5's reported 10M-token context length?
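
For anyone skimming: my rough reading of the paper is that it keeps ordinary dot-product attention within each segment and adds a compressive, linear-attention-style memory (a fixed-size matrix plus a normalization vector per head) that accumulates keys and values across segments, with a learned gate mixing the two. Below is a minimal single-head sketch of that idea; the names, shapes, and the simple (non-delta-rule) memory update are my own simplification, not the authors' code.

```python
# Minimal single-head sketch of the Infini-attention idea (my simplification,
# not the authors' implementation): local attention per segment + a compressive
# memory carried across segments, combined via a learned gate.
import torch
import torch.nn.functional as F

def elu_plus_one(x):
    # Non-negative feature map sigma(x) = ELU(x) + 1 used for the linear memory.
    return F.elu(x) + 1.0

def infini_attention_segments(segments_q, segments_k, segments_v, beta):
    """Process a list of (Q, K, V) segments; Q/K are (seg_len, d_k), V is (seg_len, d_v)."""
    d_k = segments_q[0].shape[-1]
    d_v = segments_v[0].shape[-1]
    memory = torch.zeros(d_k, d_v)   # compressive memory M (fixed size)
    norm = torch.zeros(d_k)          # normalization term z
    outputs = []
    for q, k, v in zip(segments_q, segments_k, segments_v):
        # 1) Ordinary causal dot-product attention inside the segment.
        scores = (q @ k.T) / d_k ** 0.5
        causal = torch.triu(torch.full_like(scores, float("-inf")), diagonal=1)
        local = F.softmax(scores + causal, dim=-1) @ v
        # 2) Linear-attention retrieval from the memory of all past segments.
        sig_q = elu_plus_one(q)
        retrieved = (sig_q @ memory) / (sig_q @ norm).clamp(min=1e-6).unsqueeze(-1)
        # 3) Learned gate mixes memory retrieval with local attention.
        gate = torch.sigmoid(beta)
        outputs.append(gate * retrieved + (1.0 - gate) * local)
        # 4) Update memory with this segment's keys/values (the paper also
        #    describes a delta-rule variant of this update).
        sig_k = elu_plus_one(k)
        memory = memory + sig_k.T @ v
        norm = norm + sig_k.sum(dim=0)
    return torch.cat(outputs, dim=0)
```

The appeal, as I understand it, is that the memory stays a fixed d_k x d_v matrix no matter how many segments have been processed, so per-step compute and memory are bounded, which is the property that would make multi-million-token contexts plausible.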

