MLC LLM
Simon Willison's Weblog (simonwillison.net)
From MLC, the team that gave us Web LLM and Web Stable Diffusion. "MLC LLM is a universal solution that allows any language model to be deployed natively on a diverse set of hardware backends and native applications". I installed their iPhone demo from TestFlight this morning, and it does indeed provide an offline LLM that runs on my phone. It's reasonably capable: the underlying model for the app is vicuna-v1-7b, a LLaMA derivative.
Via @simonw