Nov. 21, 2023, 9:11 p.m.

Computerworld www.computerworld.com



Anthropic has upped the ante for how much information a large language model (LLM) can consume at once, announcing on Tuesday that its just-released Claude 2.1 has a context window of 200,000 tokens. That's roughly the equivalent of 150,000 words, or more than 500 printed pages of information, Anthropic said.
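As a back-of-the-envelope check on those figures, the short Python sketch below converts the 200,000-token window into approximate word and page counts. The ratios of roughly 0.75 words per token and about 300 words per printed page are common rules of thumb, not figures from Anthropic, and actual tokenization varies with the text.

```python
# Rough sizing of a 200,000-token context window.
# The words-per-token and words-per-page ratios are heuristics only;
# real tokenization depends on the model and the text itself.

CONTEXT_TOKENS = 200_000
WORDS_PER_TOKEN = 0.75   # common rule of thumb for English prose
WORDS_PER_PAGE = 300     # typical printed page (assumption)

words = CONTEXT_TOKENS * WORDS_PER_TOKEN
pages = words / WORDS_PER_PAGE

print(f"~{words:,.0f} words, ~{pages:,.0f} printed pages")
# -> ~150,000 words, ~500 printed pages
```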

The latest Claude version is also more accurate than its predecessor, has lower pricing, and adds tool use in beta, the company said in its announcement.
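For readers who want to try the larger window, here is a minimal sketch of sending a long document to claude-2.1 through Anthropic's Python SDK (Messages API). The file name and prompt are placeholders, an ANTHROPIC_API_KEY is assumed to be set in the environment, and the tool-use feature mentioned above is a separate beta not shown here.

```python
# Minimal sketch: summarizing a long document with Claude 2.1 via the
# Anthropic Python SDK. The document path and prompt are illustrative;
# ANTHROPIC_API_KEY must be set in the environment.
import anthropic

client = anthropic.Anthropic()

with open("annual_report.txt", encoding="utf-8") as f:  # placeholder file
    document = f.read()

response = client.messages.create(
    model="claude-2.1",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": f"{document}\n\nSummarize the key points of the document above.",
        }
    ],
)

print(response.content[0].text)
```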
