BTLM-3B-8k-base brings LLM capabilities to devices with just 3GB of memory
Aug. 2, 2023, 11:52 a.m. | Matthias Bastian
THE DECODER the-decoder.com
Cerebras and Opentensor have trained a powerful 3-billion-parameter language model with an 8k-token context window.