Dec. 10, 2023, 7:58 p.m. | Andrej Baranovskij


LlamaIndex is an excellent choice for implementing RAG. It provides a clean API for connecting to different data sources and extracting data, and it includes an integration with Ollama. This means you can easily use LlamaIndex with on-premise LLMs served through Ollama. In this video I walk through a sample app where LlamaIndex works with Ollama to extract data from PDF invoices.

GitHub repo:
https://github.com/katanaml/llm-ollama-llamaindex-invoice-cpu
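The pipeline described above can be sketched in a few lines of LlamaIndex code. This is a minimal illustration, not the repo's implementation: the model name (`llama2`), embedding model, invoice file path, and query string are all assumptions, and it requires a local Ollama server to be running.

```python
# Sketch: RAG over a PDF invoice with LlamaIndex + a local Ollama LLM.
# Assumes `ollama serve` is running and the model has been pulled
# (e.g. `ollama pull llama2`). File path and model names are placeholders.
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Route LLM calls to the local Ollama instance; embed locally on CPU.
Settings.llm = Ollama(model="llama2", request_timeout=120.0)
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Load the PDF, index its chunks, and query for invoice fields.
documents = SimpleDirectoryReader(input_files=["invoice.pdf"]).load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

response = query_engine.query("What is the invoice number and the total amount?")
print(response)
```

Because both the LLM and the embedding model run locally, no invoice data leaves the machine; see the repo above for the full app, including its actual model and config choices.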

0:00 Intro
0:57 Libs
1:46 Config
2:58 Main script
4:43 RAG pipeline
7:02 Example
8:09 Summary


