Nov. 21, 2023, 9:11 p.m. |

Computerworld www.computerworld.com



Anthropic has upped the ante for how much information a large language model (LLM) can consume at once, announcing on Tuesday that its just-released Claude 2.1 has a context window of 200,000 tokens. That's roughly the equivalent of 150,000 words, or more than 500 printed pages of information, Anthropic said.
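
As a back-of-the-envelope check on those figures, the sketch below converts the 200,000-token window into words and pages using the commonly cited heuristic of about 0.75 English words per token and an assumed 300 words per printed page; both constants are rules of thumb, not numbers from Anthropic.

    # Back-of-the-envelope conversion of Claude 2.1's context window.
    # Assumptions: ~0.75 English words per token (a common heuristic,
    # not a tokenizer measurement) and ~300 words per printed page.
    CONTEXT_TOKENS = 200_000
    WORDS_PER_TOKEN = 0.75
    WORDS_PER_PAGE = 300

    words = CONTEXT_TOKENS * WORDS_PER_TOKEN
    pages = words / WORDS_PER_PAGE
    print(f"~{words:,.0f} words across ~{pages:,.0f} printed pages")
    # -> ~150,000 words across ~500 printed pages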

The latest Claude version is also more accurate than its predecessor, comes at a lower price, and adds tool use in beta, the company said in its announcement.
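
For developers, a minimal sketch of calling the new model through Anthropic's Python SDK is shown below; it assumes the completions-style interface the SDK offered at the time, an ANTHROPIC_API_KEY set in the environment, and an illustrative prompt.

    from anthropic import Anthropic, HUMAN_PROMPT, AI_PROMPT

    client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment

    # Query the new model; "claude-2.1" is the model id for this release.
    completion = client.completions.create(
        model="claude-2.1",
        max_tokens_to_sample=300,
        prompt=f"{HUMAN_PROMPT} Summarize the key points of this report.{AI_PROMPT}",
    )
    print(completion.completion)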

