Feb. 28, 2024, 9:44 p.m. | David Myriel


When Anthropic came out with a context window of 100K tokens, they said: “Vector search is dead. LLMs are getting more accurate and won’t need RAG anymore.”


Google’s Gemini 1.5 now boasts a context window of up to 10 million tokens. Its technical report claims victory over accuracy issues, even under Greg Kamradt’s NIAH (needle-in-a-haystack) methodology.
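For readers who haven’t seen NIAH testing, here is a minimal sketch of the idea: bury a single fact (the “needle”) at varying depths inside a long filler context (the “haystack”) and check whether the model can retrieve it. The function names, filler text, and the stand-in `fake_model` below are illustrative assumptions, not Kamradt’s actual harness or the setup used in the Gemini paper.

```python
# Minimal needle-in-a-haystack (NIAH) sketch: hide one fact at a chosen
# depth in a long filler context, then ask the model to recover it.

NEEDLE = "The secret passphrase is 'blue-harbor-42'."
QUESTION = "What is the secret passphrase?"
FILLER_SENTENCE = "The quick brown fox jumps over the lazy dog. "

def build_haystack(total_sentences: int, needle_depth: float) -> str:
    """Place the needle at a relative depth (0.0 = start, 1.0 = end)."""
    sentences = [FILLER_SENTENCE] * total_sentences
    insert_at = int(needle_depth * total_sentences)
    sentences.insert(insert_at, NEEDLE + " ")
    return "".join(sentences)

def run_niah_trial(ask_model, total_sentences: int, needle_depth: float) -> bool:
    """ask_model is any callable (context, question) -> answer text."""
    context = build_haystack(total_sentences, needle_depth)
    answer = ask_model(context, QUESTION)
    return "blue-harbor-42" in answer

if __name__ == "__main__":
    # Stand-in "model" that naively scans the context, just so the sketch runs.
    def fake_model(context: str, question: str) -> str:
        for sentence in context.split("."):
            if "passphrase" in sentence:
                return sentence
        return "I don't know."

    for depth in (0.0, 0.25, 0.5, 0.75, 1.0):
        ok = run_niah_trial(fake_model, total_sentences=10_000, needle_depth=depth)
        print(f"depth={depth:.2f} recovered={ok}")
```

In a real evaluation, `ask_model` would call the LLM under test, and the grid of context lengths and needle depths is swept to map where recall starts to degrade.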


It’s over. RAG must be completely obsolete now. Right?


No.


Larger context windows are never the solution. Let me repeat. Never. They require …

