Introducing Claude 2.1
Nov. 22, 2023, 4:28 a.m. | Simon Willison's Weblog (simonwillison.net)
Anthropic's Claude used to have the longest token context of any of the major models: 100,000 tokens, which is about 300 pages. Then GPT-4 Turbo came out with 128,000 tokens and Claude lost one of its key differentiators.
Claude is back! Version 2.1, announced today, bumps the token limit up to 200,000 tokens and adds support for OpenAI-style system prompts, a feature I've been really missing.
They also announced tool use, but that's only available for …
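For readers unfamiliar with the term, an "OpenAI-style system prompt" is a standing instruction carried in a dedicated `system` slot of the request, kept separate from the user's conversational turns. The sketch below shows the general shape of such a request payload as a plain dictionary; the model name and helper function are illustrative assumptions, not Anthropic's actual API.

```python
# Sketch of an OpenAI-style chat request carrying a system prompt.
# The payload shape follows the Chat Completions convention; the model
# name is hypothetical and no network call is made.

def build_chat_request(system_prompt: str, user_message: str) -> dict:
    """Assemble a chat-style payload where the "system" role holds
    persistent instructions, separate from the user's turn."""
    return {
        "model": "example-model",  # hypothetical model name
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

request = build_chat_request(
    "You are a terse assistant. Answer in one sentence.",
    "What does a system prompt do?",
)
print(request["messages"][0]["role"])  # system
```

The appeal of this design is that application-level instructions (persona, tone, constraints) survive across turns without being mixed into the user's own text.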