May 6, 2024, 7:19 a.m. | Andrej Baranovskij

Documents like invoices or receipts can be processed by an LLM directly, without RAG. I explain how to do this locally with Ollama and Instructor. Thanks to Instructor, the structured output from the LLM can be validated against your own Pydantic class.

Sparrow GitHub repo:
https://github.com/katanaml/sparrow
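
Below is a minimal sketch of the pattern, assuming Ollama's OpenAI-compatible endpoint and a hypothetical Invoice schema (the field names and model name are illustrative, not Sparrow's actual code):

import instructor
from openai import OpenAI
from pydantic import BaseModel, Field

# Hypothetical invoice schema for illustration; Sparrow's real models may differ.
class Invoice(BaseModel):
    invoice_number: str = Field(description="Invoice identifier")
    total: float = Field(description="Total amount due")
    currency: str = Field(description="Currency code, e.g. EUR")

# Ollama exposes an OpenAI-compatible API at localhost:11434/v1,
# so Instructor can wrap the standard OpenAI client.
client = instructor.from_openai(
    OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),
    mode=instructor.Mode.JSON,
)

document_text = """
ACME Corp, Invoice INV-0042
Total due: 1250.00 EUR
"""

invoice = client.chat.completions.create(
    model="llama3",  # any local model pulled into Ollama
    response_model=Invoice,  # Instructor validates the output against this Pydantic class
    messages=[
        {"role": "user", "content": f"Extract the invoice fields from this text:\n{document_text}"},
    ],
)
print(invoice.model_dump())

Instructor parses the model's response into the Invoice class and raises a validation error if the output does not match; it can also re-ask the model when max_retries is set.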

0:00 Intro
0:36 Libraries
1:22 Example
4:11 Code
5:40 Summary

CONNECT:
- Subscribe to this YouTube channel
- Twitter: https://twitter.com/andrejusb
- LinkedIn: https://www.linkedin.com/in/andrej-baranovskij/
- Medium: https://medium.com/@andrejusb

#llm #rag #instructor
