LocalAI
May 14, 2023, 1:05 p.m. | Simon Willison's Weblog (simonwillison.net)
"Self-hosted, community-driven, local OpenAI-compatible API". Designed to let you run local models such as those enabled by llama.cpp without rewriting your existing code that calls the OpenAI REST APIs. Reminds me of the various S3-compatible storage APIs that exist today.
Via Ian Johnson
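The point of an OpenAI-compatible API is that only the base URL needs to change: the same `/v1/chat/completions` request shape works against either server. A minimal sketch of that idea, assuming LocalAI's default port of 8080 and a hypothetical local model name (the actual model name depends on what you have installed):

```python
import json
from urllib import request

# Assumption: LocalAI listening on its default port. Swapping this for
# "https://api.openai.com/v1" is the only change needed for OpenAI itself.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(base_url, model, user_message):
    """Build the URL and JSON body for an OpenAI-style
    /chat/completions call. The request shape is identical for
    OpenAI and any OpenAI-compatible server; only base_url differs."""
    url = f"{base_url}/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return url, payload

# "ggml-gpt4all-j" is an illustrative model name, not a guarantee of
# what your local install provides.
url, payload = build_chat_request(BASE_URL, "ggml-gpt4all-j", "Hello!")

# To actually send it (requires a running LocalAI server):
# req = request.Request(url, data=json.dumps(payload).encode(),
#                       headers={"Content-Type": "application/json"})
# print(request.urlopen(req).read().decode())
```

This is the same portability trick the S3-compatible storage services use: keep the wire protocol fixed and let clients repoint a single endpoint setting.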