Jan. 29, 2024, 6:47 p.m. | Andrej Baranovskij


Haystack 2.0 provides functionality to process LLM output and ensure a proper JSON structure, based on a predefined Pydantic class. I show how you can run this on your local machine with Ollama. This is possible thanks to the OllamaGenerator class available in Haystack.
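The core idea is a loop: generate a reply, validate it against the expected schema, and on failure feed the validation error back into the prompt so the model can self-correct. Below is a minimal stand-in sketch of that pattern; the LLM call is mocked (`fake_llm`) and the schema check is hand-rolled so the example runs without Haystack, Ollama, or Pydantic installed. In the real pipeline, OllamaGenerator produces the reply and a Pydantic model performs the validation.

```python
import json

# Schema the output must satisfy (a stand-in for the Pydantic class).
REQUIRED_KEYS = {"city", "population"}

def fake_llm(prompt: str, attempt: int) -> str:
    # Mock generator: the first reply is malformed, the retry is valid JSON.
    # In the actual setup this would be a call to OllamaGenerator.
    if attempt == 0:
        return "Sure! Here is the data: {city: Vilnius}"  # not valid JSON
    return '{"city": "Vilnius", "population": 588412}'

def validate(reply: str):
    # Returns (parsed_dict, None) on success or (None, error_message) on failure.
    try:
        data = json.loads(reply)
    except json.JSONDecodeError as exc:
        return None, f"invalid JSON: {exc}"
    missing = REQUIRED_KEYS - data.keys()
    if missing:
        return None, f"missing keys: {sorted(missing)}"
    return data, None

def generate_structured(prompt: str, max_loops: int = 3) -> dict:
    # Loop-based auto-correction: generate, validate, and on failure
    # append the error to the prompt and try again.
    for attempt in range(max_loops):
        reply = fake_llm(prompt, attempt)
        data, error = validate(reply)
        if data is not None:
            return data
        prompt += f"\nYour previous reply was invalid ({error}). Return only valid JSON."
    raise RuntimeError("no valid output after max_loops attempts")

result = generate_structured("Return info about Vilnius as JSON.")
print(result)  # {'city': 'Vilnius', 'population': 588412}
```

In the Haystack pipeline from the tutorial, this loop is expressed by connecting a validator component's error output back to the prompt builder, so the retry happens inside the pipeline rather than in a hand-written `for` loop.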

Sparrow GitHub repo:
https://github.com/katanaml/sparrow

Tutorial: Generating Structured Output with Loop-Based Auto-Correction:
https://haystack.deepset.ai/tutorials/28_structured_output_with_loop

0:00 Intro
0:40 OllamaGenerator
1:09 Pydantic classes
1:43 Output validator
2:54 Haystack pipeline
3:12 Ollama
3:25 Haystack pipeline
4:54 Example
6:15 Troubleshooting
6:19 Notus LLM
7:40 Result …

