microsoft/Phi-3-mini-4k-instruct-gguf
Simon Willison's Weblog simonwillison.net
Microsoft's Phi-3 LLM is out and it's really impressive. This 4,000 token context GGUF model is just 2.2GB (for the Q4 version) and ran on my Mac using the llamafile option described in the README. I could then run prompts through it using the llm-llamafile plugin.
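The workflow described above can be sketched as a few shell steps. This is a hedged sketch, not the post's exact commands: the download URL, the llamafile binary name, and the prompt are assumptions; the model README has the authoritative instructions.

```shell
# Sketch of the workflow: download the Q4 GGUF, serve it with llamafile,
# then prompt it via the llm-llamafile plugin. Filenames and URLs are
# illustrative assumptions; check the model README for exact commands.

MODEL=Phi-3-mini-4k-instruct-q4.gguf  # the ~2.2GB Q4 quantization

download_model() {
  # Fetch the quantized GGUF from the Hugging Face repo (assumed path).
  curl -LO "https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-gguf/resolve/main/$MODEL"
}

serve_model() {
  # Run the model with a llamafile binary (binary name is an assumption);
  # llamafile starts a local server on http://localhost:8080 by default.
  ./llamafile -m "$MODEL"
}

prompt_model() {
  # With the server running, the llm-llamafile plugin exposes it to the
  # llm CLI as a model named "llamafile".
  llm install llm-llamafile
  llm -m llamafile "Write a haiku about a tiny language model"
}
```

The functions mirror the three steps in order: fetch the model, start the llamafile server, then send prompts to it with `llm`.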
The vibes are good! Initial test prompts I've tried feel similar to much larger 7B models, despite using just a few GBs of RAM. Tokens are returned fast too - it feels …