Jan. 8, 2024, 8:19 a.m. | Andrej Baranovskij


This is Sparrow, our open-source solution for document processing with local LLMs. I'm running the Starling LLM locally with Ollama. I explain how to get structured JSON output with LlamaIndex and a dynamic Pydantic class. This supports the use case of extracting data from invoice documents. The solution runs entirely on the local machine, thanks to Ollama. I'm using a MacBook Air M1 with 8 GB RAM.

Sparrow GitHub repo:
https://github.com/katanaml/sparrow
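The dynamic Pydantic idea can be sketched as follows: build the model class at runtime from a field specification, then validate the LLM's JSON against it. This is a minimal illustration, not Sparrow's actual code; the field names and the sample JSON are assumptions, and in the real pipeline the JSON would come from the Starling LLM prompted through LlamaIndex rather than a hard-coded string.

```python
import json
from pydantic import create_model

# Field spec supplied at runtime (e.g. from a config file).
# Names and types here are illustrative, not Sparrow's schema.
fields = {
    "invoice_number": (str, ...),
    "invoice_date": (str, ...),
    "total": (float, ...),
}

# Create the Pydantic class dynamically from the spec.
Invoice = create_model("Invoice", **fields)

# Stand-in for the structured JSON the local LLM would return
# when prompted through LlamaIndex.
llm_output = '{"invoice_number": "INV-001", "invoice_date": "2024-01-08", "total": 123.45}'

# Parsing through the dynamic class validates types and required fields.
invoice = Invoice(**json.loads(llm_output))
print(invoice.invoice_number, invoice.total)
```

Because the class is generated from data rather than hard-coded, the same extraction pipeline can target different document types by swapping the field spec.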

0:00 Intro
0:42 Example
2:29 Config
3:08 RAG with Sparrow and …

