[R] Infinite context Transformers
April 11, 2024, 5:35 p.m. | /u/Dyoakom
Machine Learning www.reddit.com
Paper: [Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention](https://arxiv.org/abs/2404.07143)
What are your thoughts? Could this be one of the techniques behind Gemini 1.5's reported 10M-token context length?
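
For anyone skimming before reading the paper: Infini-attention keeps a fixed-size compressive memory per head that is read with a linear-attention lookup and blended with ordinary local attention through a learned gate, then updated with the current segment's keys and values. Below is a minimal PyTorch sketch of that segment-level recurrence as I read it; the function and variable names are mine, and projection/normalization details are simplified, so treat it as an illustration rather than the authors' implementation.

```python
import torch
import torch.nn.functional as F

def elu1(x):
    # sigma(x) = ELU(x) + 1: the kernel feature map used for the linear-attention memory
    return F.elu(x) + 1.0

def infini_attention_segment(q, k, v, M, z, beta):
    """One segment of Infini-attention (my sketch, not the authors' code).

    q, k, v : (seq, d) query/key/value projections for the current segment
    M       : (d, d)  compressive memory carried across segments
    z       : (d,)    normalization term carried across segments
    beta    : learned scalar gate mixing memory readout vs. local attention
    """
    d = q.shape[-1]

    # 1) Read from the compressive memory (linear-attention lookup)
    sq = elu1(q)
    a_mem = (sq @ M) / (sq @ z).clamp(min=1e-6).unsqueeze(-1)

    # 2) Ordinary causal dot-product attention within the segment
    scores = (q @ k.T) / d ** 0.5
    causal = torch.triu(torch.ones_like(scores, dtype=torch.bool), diagonal=1)
    a_dot = scores.masked_fill(causal, float("-inf")).softmax(dim=-1) @ v

    # 3) Gated blend of long-range (memory) and local context
    g = torch.sigmoid(beta)
    out = g * a_mem + (1.0 - g) * a_dot

    # 4) Write the current segment into memory; the paper also describes a
    #    delta-rule variant that subtracts the retrieved value before writing
    sk = elu1(k)
    M = M + sk.T @ v
    z = z + sk.sum(dim=0)
    return out, M, z

# Toy usage: stream two segments through the recurrence; memory stays (d, d)
d = 64
M, z = torch.zeros(d, d), torch.zeros(d)
for _ in range(2):
    q, k, v = (torch.randn(128, d) for _ in range(3))
    out, M, z = infini_attention_segment(q, k, v, M, z, beta=torch.tensor(0.0))
```

The key property is that the state carried between segments is a fixed (d, d) matrix plus a (d,) vector, so per-segment cost stays constant no matter how much text has streamed through, which is what makes the "infinite context" framing plausible.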