Nov. 13, 2023, 7:19 p.m. | Andrej Baranovskij

I explain how to compose a prompt for the Mistral 7B LLM running with LangChain and CTransformers so that the model returns its output as a JSON string, without any additional text.
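
A minimal sketch of the idea (the prompt wording and the JSON keys are illustrative for this sketch, not copied from the repo): the prompt template explicitly instructs the model to reply with a JSON object and nothing else.

from langchain.prompts import PromptTemplate

# Prompt that constrains Mistral 7B to reply with JSON only.
# The keys "answer" and "source" are placeholders for this sketch.
json_prompt = PromptTemplate(
    input_variables=["context", "question"],
    template=(
        "Use the following context to answer the question.\n"
        "Return the answer strictly as a JSON object with the keys "
        '"answer" and "source". Do not add any text before or after the JSON.\n\n'
        "Context: {context}\n"
        "Question: {question}\n"
        "JSON:"
    ),
)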

GitHub repo:
https://github.com/katanaml/llm-mistral-invoice-cpu

Recommended channel for LLM:
https://www.youtube.com/@AIAnytime

0:00 Intro
0:45 Prompt template
2:25 Data ingestion
3:00 Data chunking
4:05 LLM
4:20 LangChain
5:17 JSON responses
9:40 Summary
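
The chapters above roughly map to a LangChain pipeline: ingest and chunk the document, run Mistral 7B locally through CTransformers, and query it with the JSON-only prompt. A rough sketch under assumed settings (the model name, file name, chunk sizes and embedding model below are assumptions, not values taken from the video):

from langchain.llms import CTransformers
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import FAISS
from langchain.chains import RetrievalQA

# Data ingestion and chunking (file name and chunk sizes are assumptions).
docs = PyPDFLoader("invoice.pdf").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)

# Local Mistral 7B on CPU via CTransformers (GGUF model name is an assumption).
llm = CTransformers(
    model="TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
    model_type="mistral",
    config={"max_new_tokens": 512, "temperature": 0.1},
)

# Vector store and retrieval chain; json_prompt is the template sketched earlier.
store = FAISS.from_documents(
    chunks,
    HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2"),
)
qa = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=store.as_retriever(),
    chain_type_kwargs={"prompt": json_prompt},
)

print(qa.run("What is the invoice total?"))  # expected output: a JSON string only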

CONNECT:
- Subscribe to this YouTube channel
- Twitter: https://twitter.com/andrejusb
- LinkedIn: https://www.linkedin.com/in/andrej-baranovskij/
- Medium: https://medium.com/@andrejusb

#llm #rag #python
