June 15, 2023, 6:41 a.m. | /u/AaZasDass

Data Science www.reddit.com

OpenLLM lets you run inference with any open-source LLM, deploy to the cloud or on-premises, and build powerful AI apps. It offers simple, familiar APIs that integrate easily with tools such as LangChain and BentoML. You can find more [here](https://github.com/bentoml/OpenLLM).

To get started, install it with pip: `pip install -U openllm`. It currently supports the major state-of-the-art (SOTA) open-source LLMs, including Falcon, ChatGLM, Dolly V2, StableLM, and more. To launch an LLM server, run `openllm start`:

```shell
openllm start dolly-v2 --model-id databricks/dolly-v2-7b
```

To ask this …
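Once the server is up, you can query it over plain HTTP. Below is a minimal sketch assuming the server listens on `http://localhost:3000` and exposes a `/v1/generate` endpoint that accepts a JSON body with a `prompt` field; the port, path, and schema are assumptions here, so check the interactive API docs the running server exposes for the exact contract.

```python
import json
import urllib.request


def build_generate_request(prompt: str, base_url: str = "http://localhost:3000"):
    """Build a POST request for the (assumed) /v1/generate endpoint.

    Both the endpoint path and the payload shape are assumptions for
    illustration; verify them against your running OpenLLM server.
    """
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/v1/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    # Hypothetical usage against a server started with `openllm start`.
    req = build_generate_request("Explain gradient descent in one sentence.")
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read()))
```

The request-building step is separated from the network call so the payload can be inspected (or unit-tested) without a live server.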
