Nov. 13, 2023, 7:19 p.m. | Andrej Baranovskij


I explain how to compose a prompt for the Mistral 7B LLM, running with LangChain and CTransformers, so that it returns its output as a JSON string without any additional text.
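As a rough illustration of the idea (a hedged sketch, not the exact code from the video or the repo), the snippet below loads a quantized Mistral 7B model on CPU through CTransformers, wraps it in a LangChain LLMChain, and uses a prompt that tells the model to answer with a single JSON object and nothing else. The model file name, JSON keys, and generation settings are assumptions for the example.

# Hedged sketch (assumptions, not the repo's exact code): run a quantized Mistral 7B
# model on CPU through CTransformers and ask for a JSON-only answer via the prompt.
from langchain.llms import CTransformers
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# Model name, file and config values are assumptions; any local GGUF Mistral 7B build works.
llm = CTransformers(
    model="TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
    model_file="mistral-7b-instruct-v0.1.Q4_K_M.gguf",
    model_type="mistral",
    config={"max_new_tokens": 512, "temperature": 0.1},
)

# The key idea: spell out in the prompt that the answer must be a single JSON object,
# with the exact keys you expect, and nothing else around it.
template = """[INST] You extract fields from invoices.
Use only the context below. Answer with one valid JSON object containing the keys
"invoice_number", "date" and "total" - no explanations, no extra text.

Context:
{context}

Question: {question} [/INST]"""

prompt = PromptTemplate(template=template, input_variables=["context", "question"])
chain = LLMChain(llm=llm, prompt=prompt)

# Hypothetical invoice text; in the video this comes from the ingested document.
context = "Invoice No: INV-2023-001\nDate: 2023-11-13\nTotal: 1250.00 EUR"
answer = chain.run(context=context, question="Extract the invoice fields.")
print(answer)  # ideally a bare JSON string, e.g. {"invoice_number": "INV-2023-001", ...}

The low temperature and the explicit key list are what keep the response parseable; you can then pass the string straight to json.loads.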

GitHub repo:
https://github.com/katanaml/llm-mistral-invoice-cpu

Recommended channel for LLM content:
https://www.youtube.com/@AIAnytime

0:00 Intro
0:45 Prompt template
2:25 Data ingestion
3:00 Data chunking
4:05 LLM
4:20 LangChain
5:17 JSON responses
9:40 Summary
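For the data ingestion and data chunking chapters above, here is a hedged sketch of what that stage might look like with LangChain; the file name and chunk sizes are assumptions, not values taken from the video or the repo.

# Hedged sketch of the data ingestion and chunking steps; the file name and
# chunk sizes are assumptions, not values taken from the video or repo.
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter

loader = PyPDFLoader("invoice.pdf")   # assumption: a local sample invoice PDF
documents = loader.load()             # one Document per page

splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_documents(documents)

# Join the chunk text to build the {context} passed to the prompt shown earlier.
context = "\n".join(chunk.page_content for chunk in chunks)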

CONNECT:
- Subscribe to this YouTube channel
- Twitter: https://twitter.com/andrejusb
- LinkedIn: https://www.linkedin.com/in/andrej-baranovskij/
- Medium: https://medium.com/@andrejusb

#llm #rag #python

