Jan. 23, 2024, 7:45 a.m. | Andrej Baranovskij


In this video, I show how to get JSON output from the Notus LLM running locally with Ollama. The JSON output is generated with LlamaIndex, using the dynamic Pydantic class approach.
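To give a feel for the approach, here is a minimal sketch (not the exact Sparrow code) of generating structured JSON from a local Ollama model with a Pydantic class built at runtime. It assumes a LlamaIndex version from around early 2024 and a pulled "notus" Ollama model; the import paths, model tag, and field names are illustrative and may differ in your setup.

# Dynamic Pydantic class approach, sketched: define the output schema at
# runtime with pydantic.create_model, then let LlamaIndex fill it using a
# local Ollama model. Import paths may vary by LlamaIndex version.
from pydantic import create_model
from llama_index.llms import Ollama
from llama_index.program import LLMTextCompletionProgram
from llama_index.output_parsers import PydanticOutputParser

# Hypothetical field names and types, supplied at runtime (e.g. from a query).
fields = {"invoice_number": (str, ...), "total_amount": (float, ...)}
DynamicModel = create_model("DynamicModel", **fields)

llm = Ollama(model="notus", temperature=0.0)

program = LLMTextCompletionProgram.from_defaults(
    output_parser=PydanticOutputParser(output_cls=DynamicModel),
    prompt_template_str=(
        "Extract the requested fields from the context below and answer as JSON.\n"
        "Context: {context}\n"
    ),
    llm=llm,
)

result = program(context="Invoice INV-123, total due 1250.50 EUR")
print(result.json())  # with Pydantic v2, use result.model_dump_json() instead

Because the Pydantic class is created from a field list rather than hard-coded, the same pipeline can return different JSON structures for different queries without code changes.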

Sparrow GitHub repo:
https://github.com/katanaml/sparrow

Argilla Notus repo:
https://github.com/argilla-io/notus

0:00 Intro
0:25 Notus in Sparrow
1:54 Ingest with Weaviate fix
2:40 JSON output with LlamaIndex
4:02 Sparrow engine
5:22 Example
7:15 Summary

CONNECT:
- Subscribe to this YouTube channel
- Twitter: https://twitter.com/andrejusb
- LinkedIn: https://www.linkedin.com/in/andrej-baranovskij/
- Medium: https://medium.com/@andrejusb

#llm #rag #python
