Jan. 28, 2024, 12:21 a.m. | WorldofAI

Source: www.youtube.com

Introducing DeepSeek LLM, an advanced language model with 67 billion parameters, trained from scratch on a dataset of 2 trillion tokens in English and Chinese. To support research, the DeepSeek LLM 7B/67B Base and DeepSeek LLM 7B/67B Chat models have been released as open source for the community.
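Since the Base and Chat checkpoints are released openly, they can be loaded with the Hugging Face `transformers` library. The sketch below is a minimal, hedged example assuming the `deepseek-ai/deepseek-llm-7b-chat` model ID on the Hugging Face Hub; verify the exact ID and hardware requirements (the 7B model needs roughly 16 GB of GPU memory in fp16) before running.

```python
# Minimal sketch: generating text with the open-source DeepSeek LLM 7B Chat
# weights via Hugging Face transformers. The model ID is an assumption based
# on the deepseek-ai organization's naming; confirm it on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "deepseek-ai/deepseek-llm-7b-chat"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Return the model's reply to a single-turn chat prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Format the prompt with the tokenizer's built-in chat template.
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate("Briefly explain what a language model is."))
```

Swapping `MODEL_ID` for the 67B variant or a Base checkpoint follows the same pattern, though the 67B model requires multiple GPUs or aggressive quantization.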

🔥 Become a Patron (Private Discord): https://patreon.com/WorldofAi
☕ To support the channel, buy me a coffee or donate: https://ko-fi.com/worldofai - It would mean a lot if you did! …

