[R] Infinite context Transformers
April 11, 2024, 5:35 p.m. | /u/Dyoakom
Machine Learning www.reddit.com
[https://arxiv.org/abs/2404.07143](https://arxiv.org/abs/2404.07143)
What are your thoughts? Could this be one of the techniques behind Gemini 1.5's reported 10M-token context length?
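For context, the linked paper (arXiv:2404.07143, "Infini-attention") augments standard attention with a compressive memory that is updated segment by segment, giving bounded memory regardless of sequence length. Below is a minimal NumPy sketch of that memory read/write step under the paper's linear-attention formulation; the dimensions, random inputs, and function names are illustrative, not the authors' implementation.

```python
import numpy as np

# Sketch of the Infini-attention compressive memory (arXiv:2404.07143).
# The memory M and normalizer z persist across segments; sigma = ELU + 1
# is the feature map used for both reads and writes. Dimensions are toy.

def elu1(x):
    # sigma(x) = ELU(x) + 1 (strictly positive feature map)
    return np.where(x > 0, x + 1.0, np.exp(x))

def memory_retrieve(M, z, Q):
    # Read: A_mem = sigma(Q) M / (sigma(Q) z)
    sq = elu1(Q)                           # (n, d_k)
    return (sq @ M) / (sq @ z)[:, None]    # (n, d_v)

def memory_update(M, z, K, V):
    # Write: M <- M + sigma(K)^T V ;  z <- z + sum_t sigma(K_t)
    sk = elu1(K)
    return M + sk.T @ V, z + sk.sum(axis=0)

d_k, d_v, n = 4, 4, 8
rng = np.random.default_rng(0)
M = np.zeros((d_k, d_v))    # compressive memory, fixed size per head
z = np.full(d_k, 1e-6)      # normalization term (small init to avoid /0)

# Write keys/values from segment 1, then retrieve with segment 2's queries.
K1, V1 = rng.normal(size=(n, d_k)), rng.normal(size=(n, d_v))
M, z = memory_update(M, z, K1, V1)
Q2 = rng.normal(size=(n, d_k))
A_mem = memory_retrieve(M, z, Q2)
print(A_mem.shape)  # (8, 4)
```

In the paper this memory read is then blended with ordinary local dot-product attention via a learned per-head gate, so each layer mixes long-range compressed context with exact attention over the current segment.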