Jan. 23, 2024, 7:45 a.m. | Andrej Baranovskij


In this video, I show how to get JSON output from the Notus LLM running locally with Ollama. The JSON output is generated with LlamaIndex using the dynamic Pydantic class approach, sketched below.
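A minimal sketch of the dynamic Pydantic class approach, assuming LlamaIndex's LLMTextCompletionProgram and PydanticOutputParser with a Notus model served by Ollama. Exact import paths vary by LlamaIndex version, the field spec, prompt, and model tag here are illustrative assumptions, and Sparrow's actual implementation may differ (see the repo below).

# Sketch: JSON output from a local Notus model via Ollama, using a
# Pydantic class built at runtime. Import paths vary by LlamaIndex version.
from pydantic import create_model
from llama_index.core.program import LLMTextCompletionProgram
from llama_index.core.output_parsers import PydanticOutputParser
from llama_index.llms.ollama import Ollama

# Build the Pydantic class dynamically from a field spec (name -> type),
# e.g. supplied by the caller instead of being hard-coded.
fields = {"invoice_number": (str, ...), "total": (float, ...)}
DynamicModel = create_model("DynamicModel", **fields)

# Notus running locally in Ollama (model tag is an assumption; use the
# tag you pulled, e.g. `ollama pull notus`).
llm = Ollama(model="notus", request_timeout=120.0)

prompt_template_str = (
    "Extract the requested fields from the document text below "
    "and answer strictly as JSON.\n"
    "Document:\n{document_text}\n"
)

program = LLMTextCompletionProgram.from_defaults(
    output_parser=PydanticOutputParser(output_cls=DynamicModel),
    prompt_template_str=prompt_template_str,
    llm=llm,
    verbose=True,
)

result = program(document_text="Invoice no. INV-0042, total due: 137.50 EUR")
print(result.model_dump_json())  # pydantic v2; e.g. {"invoice_number": "INV-0042", "total": 137.5}

The point of create_model is that the output schema is not fixed at design time: the extraction fields can be assembled from user input per request, and the parser still returns a validated Pydantic object.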

Sparrow GitHub repo:
https://github.com/katanaml/sparrow

Argilla Notus repo:
https://github.com/argilla-io/notus

0:00 Intro
0:25 Notus in Sparrow
1:54 Ingest with Weaviate fix
2:40 JSON output with LlamaIndex
4:02 Sparrow engine
5:22 Example
7:15 Summary

CONNECT:
- Subscribe to this YouTube channel
- Twitter: https://twitter.com/andrejusb
- LinkedIn: https://www.linkedin.com/in/andrej-baranovskij/
- Medium: https://medium.com/@andrejusb

#llm #rag #python
