March 11, 2024, 12:26 p.m. | /u/Muted-Witness-7196

Machine Learning www.reddit.com

Google recently issued a [technical report](https://arxiv.org/ftp/arxiv/papers/2403/2403.05530.pdf) on Gemini 1.5 Pro and its 10M-token context. They give a brief overview of modern approaches to improving the long-context capabilities of models (e.g. recurrent memory, ring attention, novel architectures, etc.), but I didn't find any information about the approach Gemini 1.5 Pro itself uses.

Has anyone noticed information about their approach in official publications? It seems to be Google's secret.
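
For anyone unfamiliar with ring attention (one of the approaches the report surveys, not necessarily what Gemini uses), the core idea is blockwise attention: key/value chunks are rotated around a ring of devices while each device accumulates softmax statistics online for its local query block. The sketch below simulates that accumulation pattern in a single process with NumPy; it is purely illustrative and says nothing about Gemini 1.5 Pro's actual method.

```python
# Minimal single-process sketch of the ring-attention accumulation pattern.
# The sequence is split into blocks (standing in for devices); every K/V
# block is passed by each query block while softmax statistics are merged
# online. Illustrative only.
import numpy as np

def ring_attention(q, k, v, n_blocks):
    # q, k, v: (seq_len, d), split along the sequence dimension
    q_blocks = np.array_split(q, n_blocks)
    k_blocks = np.array_split(k, n_blocks)
    v_blocks = np.array_split(v, n_blocks)
    d = q.shape[-1]
    outputs = []
    for qi in q_blocks:
        m = np.full(qi.shape[0], -np.inf)   # running max of attention logits
        l = np.zeros(qi.shape[0])           # running softmax normalizer
        acc = np.zeros_like(qi)             # unnormalized output accumulator
        # "Rotate" every K/V block past this query block, one hop at a time
        for kj, vj in zip(k_blocks, v_blocks):
            s = qi @ kj.T / np.sqrt(d)              # local attention logits
            m_new = np.maximum(m, s.max(axis=-1))
            scale = np.exp(m - m_new)               # rescale old statistics
            p = np.exp(s - m_new[:, None])
            l = l * scale + p.sum(axis=-1)
            acc = acc * scale[:, None] + p @ vj
            m = m_new
        outputs.append(acc / l[:, None])
    return np.concatenate(outputs, axis=0)

# Sanity check against full (unblocked) attention
rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(16, 8)) for _ in range(3))
s = q @ k.T / np.sqrt(8)
ref = np.exp(s - s.max(-1, keepdims=True))
ref = (ref / ref.sum(-1, keepdims=True)) @ v
assert np.allclose(ring_attention(q, k, v, 4), ref)
```

In the distributed version, the inner loop becomes a send/receive of K/V blocks between neighboring devices, overlapped with the local attention compute, so no device ever materializes the full sequence's keys and values at once.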

google machinelearning publications secret
