Nov. 21, 2023, 9:11 p.m. |

Computerworld www.computerworld.com



Anthropic has upped the ante for how much information a large language model (LLM) can consume at once, announcing on Tuesday that its just-released Claude 2.1 has a context window of 200,000 tokens. That's roughly the equivalent of 150,000 words, or more than 500 printed pages of information, Anthropic said.
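As a back-of-the-envelope sanity check on those figures, a minimal sketch using the common rules of thumb of roughly 0.75 English words per token and roughly 300 words per printed page (both heuristics are assumptions here, not numbers from Anthropic):

```python
# Rough conversion of a 200,000-token context window into words and pages.
# WORDS_PER_TOKEN and WORDS_PER_PAGE are common heuristics, not exact values.
TOKENS = 200_000
WORDS_PER_TOKEN = 0.75   # typical for English text
WORDS_PER_PAGE = 300     # typical for a printed page

words = TOKENS * WORDS_PER_TOKEN   # about 150,000 words
pages = words / WORDS_PER_PAGE     # about 500 printed pages

print(f"{words:,.0f} words, roughly {pages:,.0f} printed pages")
```

Actual word counts vary with the text being tokenized, but the estimate lines up with the figures in Anthropic's announcement.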

The latest Claude version is also more accurate than its predecessor, carries a lower price, and includes beta tool use, the company said in its announcement.

