Feb. 16, 2024, 8:07 p.m. | /u/daxow

Machine Learning www.reddit.com

TL;DR: How does input token size relate to context window size? ChatGPT (128K context, 4,096 input token limit). What about Gemini 1.5 (1M context window, ??? input token limit)?

Since Gemini 1.5 launched, I've been reading up on it to see if it can replace the ChatGPT 3.5 we're using. Our use case involves a lot of input text, so we break it down into smaller chunks and pass those to ChatGPT, because the input …
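For reference, here is a minimal sketch of the kind of token-based chunking described above, assuming the `tiktoken` library with the `cl100k_base` encoding (the one used by GPT-3.5/GPT-4); the 4,096-token chunk size simply mirrors the limit mentioned in the TL;DR and would need to be adjusted for whatever model is actually called.

```python
# Sketch: split a long document into chunks that each fit within a
# model's input token limit. Assumes tiktoken is installed; the
# 4096 figure comes from the limit mentioned above, not from any
# particular API's documented maximum.
import tiktoken


def chunk_by_tokens(text: str, max_tokens: int = 4096) -> list[str]:
    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode(text)
    chunks = []
    for start in range(0, len(tokens), max_tokens):
        # Decode each token slice back to text so it can be sent as a prompt.
        chunks.append(enc.decode(tokens[start:start + max_tokens]))
    return chunks


if __name__ == "__main__":
    long_text = "example input " * 5000  # placeholder for the real document
    for i, chunk in enumerate(chunk_by_tokens(long_text)):
        print(f"chunk {i}: {len(chunk)} characters")
```

A larger context window (like Gemini 1.5's 1M tokens) would mainly change `max_tokens` here, or remove the need for chunking altogether if the whole document fits.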

