Ollama
July 18, 2023, 9 p.m. | Simon Willison's Weblog (simonwillison.net)
Ollama is a tool for running LLMs directly on your own laptop. It includes an installer for macOS (Apple Silicon) and provides a terminal chat interface for interacting with models. Llama 2 support is already working, with a model that downloads directly from Ollama's own registry service, without needing to register for an account or work your way through a waiting list.
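The workflow described above can be sketched from the terminal; the `llama2` model name assumes the identifier Ollama's registry used for Llama 2 at the time:

```shell
# Chat with Llama 2 in the terminal. On first run, Ollama pulls the
# model from its own registry (no account or waiting list needed),
# then drops into an interactive chat prompt.
ollama run llama2

# Optionally, download the model ahead of time without starting a chat:
ollama pull llama2
```

The same `run` command works for any model the registry hosts; swapping the model name is the only change needed.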