Stability AI Open-Sources 7B Parameter Language Model StableLM
May 2, 2023, 1 p.m. | Anthony Alford
InfoQ - AI, ML & Data Engineering www.infoq.com
Stability AI released two sets of pre-trained model weights for StableLM, a suite of large language models (LLMs). The models were trained on 1.5 trillion text tokens and are licensed for commercial use under CC BY-SA-4.0.