Jan. 29, 2024, 6:47 p.m. | Andrej Baranovskij


Haystack 2.0 provides functionality to process LLM output and ensure a proper JSON structure, based on a predefined Pydantic class. I show how you can run this on your local machine with Ollama. This is possible thanks to the OllamaGenerator class available in Haystack.
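The pattern behind this (and the linked tutorial) is a validate-and-retry loop: ask the LLM for JSON matching a schema, parse and check the reply, and on failure feed the error message back into the prompt and regenerate. Here is a minimal stdlib-only sketch of that loop; `call_llm` is a hypothetical stand-in for the OllamaGenerator call, and the hand-rolled `validate` replaces the Pydantic validation used in the actual tutorial:

```python
import json

def validate(reply: str):
    """Parse the LLM reply and check required fields; return (ok, payload_or_error)."""
    try:
        data = json.loads(reply)
    except json.JSONDecodeError as e:
        return False, f"Invalid JSON: {e}"
    for field, typ in (("city", str), ("population", int)):
        if not isinstance(data.get(field), typ):
            return False, f"Field '{field}' missing or not of type {typ.__name__}"
    return True, data

def generate_with_correction(call_llm, prompt, max_loops=3):
    """Retry generation, appending the validation error to the prompt each loop."""
    for _ in range(max_loops):
        reply = call_llm(prompt)
        ok, result = validate(reply)
        if ok:
            return result
        prompt += f"\nYour previous reply was invalid: {result}. Return corrected JSON only."
    raise RuntimeError("LLM failed to produce valid JSON")

# Stubbed LLM for illustration: fails once, then returns valid JSON.
replies = iter(['{"city": "Vilnius"}',
                '{"city": "Vilnius", "population": 593000}'])
result = generate_with_correction(lambda p: next(replies), "Return JSON about Vilnius.")
print(result)  # {'city': 'Vilnius', 'population': 593000}
```

In the Haystack version, this loop is expressed as a pipeline with a custom output-validator component that routes invalid replies back to the prompt builder.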

Sparrow GitHub repo:
https://github.com/katanaml/sparrow

Tutorial: Generating Structured Output with Loop-Based Auto-Correction:
https://haystack.deepset.ai/tutorials/28_structured_output_with_loop

0:00 Intro
0:40 OllamaGenerator
1:09 Pydantic classes
1:43 Output validator
2:54 Haystack pipeline
3:12 Ollama
3:25 Haystack pipeline
4:54 Example
6:15 Troubleshooting
6:19 Notus LLM
7:40 Result …

