Set Up a REST API Service for AI Using Local LLMs with Ollama
May 9, 2024, 2:55 p.m. | Dexter
DEV Community dev.to
Setting up a REST API service for AI using Local LLMs with Ollama seems like a practical approach. Here’s a simple workflow.
1. Install Ollama and LLMs
Begin by installing Ollama and the LLMs on your local machine. Ollama facilitates local deployment of LLMs, making them easier to manage and use for various tasks.
Install Ollama.
Install LLMs for Ollama:

ollama pull llama3   # download the llama3 model weights
ollama run llama3    # start an interactive session with llama3
Ollama Commands

Available Commands:
/set    Set session variables
/show   Show model …
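The excerpt above stops before the REST layer itself. As a minimal sketch of the next step, the service can call Ollama's built-in local HTTP endpoint (`http://localhost:11434/api/generate` by default). The snippet below uses only the standard library and assumes `ollama serve` is running and `llama3` has been pulled; the helper names (`build_payload`, `generate`) are illustrative, not from the original article.

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON body that Ollama's /api/generate endpoint expects."""
    # stream=False asks Ollama for a single JSON response instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "llama3") -> str:
    """POST the prompt to the local Ollama server and return the generated text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

A web framework such as FastAPI (which appears in the article's tags) could then expose `generate` behind a route, turning the local model into a REST API service for other applications.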