Meet DeepSeek-Coder-V2 by DeepSeek AI: The First Open-Source AI Model to Surpass GPT-4 Turbo in Coding and Math, Supporting 338 Programming Languages and 128K Context Length
June 19, 2024, 1:01 a.m. | /u/ai-lover | machinelearningnews (www.reddit.com)
DeepSeek-Coder-V2 employs a Mixture-of-Experts (MoE) framework, supports 338 programming languages, and extends the context length from 16K to 128K tokens. The …
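For readers unfamiliar with the MoE framework mentioned above, the core idea is that a gating network routes each token to a small subset of "expert" sub-networks, so only a fraction of the model's parameters run per token. The sketch below is a toy illustration of top-k gating in plain Python; it is not DeepSeek's actual implementation, and the expert count and logit values are made up for the example.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of gate logits.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def moe_route(gate_logits, k=2):
    """Pick the top-k experts for one token and renormalize
    their gate weights so the selected weights sum to 1."""
    probs = softmax(gate_logits)
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    total = sum(probs[i] for i in top)
    return [(i, probs[i] / total) for i in top]

# One token, four hypothetical experts: only the two experts with the
# highest gate scores would actually be executed for this token.
routing = moe_route([0.1, 2.0, -1.0, 1.5], k=2)
print(routing)
```

In a real MoE layer the selected experts' outputs are summed with these renormalized weights, which is what lets a model keep total parameter count high while per-token compute stays close to a much smaller dense model.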
Jobs in AI, ML, Big Data
AI Focused Biochemistry Postdoctoral Fellow @ Lawrence Berkeley National Lab | Berkeley, CA
Senior Data Engineer @ Displate | Warsaw
Hybrid Cloud Engineer @ Vanguard | Wayne, PA
Senior Software Engineer @ F5 | San Jose
Software Engineer, Backend, 3+ Years of Experience @ Snap Inc. | Bellevue - 110 110th Ave NE
Global Head of Commercial Data Foundations @ Sanofi | Cambridge