LocalAI
May 14, 2023, 1:05 p.m. | Simon Willison's Weblog (simonwillison.net)
"Self-hosted, community-driven, local OpenAI-compatible API". Designed to let you run local models such as those enabled by llama.cpp without rewriting your existing code that calls the OpenAI REST APIs. Reminds me of the various S3-compatible storage APIs that exist today.
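Because LocalAI mirrors the OpenAI REST API, switching an existing client over is mostly a matter of pointing it at a different base URL. A minimal sketch of what that looks like, assuming LocalAI is running on its default port (8080) and a hypothetical local model name:

```python
import json

# Assumption: LocalAI is serving on localhost:8080, its documented default.
LOCALAI_BASE = "http://localhost:8080/v1"

def chat_completion_request(model, messages, base_url=LOCALAI_BASE):
    """Build an OpenAI-style chat completion request.

    The URL path and JSON body are identical to OpenAI's
    /v1/chat/completions endpoint; only the base URL changes,
    which is why existing OpenAI client code keeps working.
    """
    url = f"{base_url}/chat/completions"
    body = json.dumps({"model": model, "messages": messages})
    return url, body

# "ggml-gpt4all-j" is an illustrative local model name, not a given.
url, body = chat_completion_request(
    "ggml-gpt4all-j",
    [{"role": "user", "content": "Hello"}],
)
```

In practice you would POST that body to the URL with any HTTP client, or simply set your OpenAI SDK's base URL to the LocalAI address and leave the rest of your code untouched.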
Via Ian Johnson