Feb. 18, 2024, 12:55 p.m. | /u/Wiskkey

Machine Learning www.reddit.com

From [What is a long context window?](https://blog.google/technology/ai/long-context-window-ai-models/):

>"Our original plan was to achieve 128,000 tokens in context, and I thought setting an ambitious bar would be good, so I suggested 1 million tokens," says Google DeepMind Research Scientist Nikolay Savinov, one of the research leads on the long context project. “And now we’ve even surpassed that in our research by 10x.”
>
>To make this kind of leap forward, the team had to make a series of deep learning innovations. …

