Dec. 22, 2023, 11:45 a.m. | Prompt Engineering

Prompt Engineering www.youtube.com

In this tutorial, we walk step by step through fine-tuning the Mixtral 8x7B MoE model from Mistral AI on your own dataset.
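The full walkthrough is in the video; for orientation, here is a minimal sketch of the usual QLoRA recipe this kind of tutorial follows, using the transformers, peft, trl, and bitsandbytes libraries. The dataset, LoRA target modules, and hyperparameters below are illustrative placeholders, not the video's exact settings, and trl's SFTTrainer arguments have shifted between versions.

```python
# Rough QLoRA fine-tuning sketch for Mixtral-8x7B (assumptions noted inline;
# this is not the video's verbatim code).
import torch
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          BitsAndBytesConfig, TrainingArguments)
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training
from trl import SFTTrainer

model_id = "mistralai/Mixtral-8x7B-v0.1"

# Load in 4-bit so the model fits on a single large GPU (a free T4 is too small).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)
model = prepare_model_for_kbit_training(model)

# LoRA adapters on the attention projections; the target-module list is an
# assumption, not confirmed by the video.
lora_config = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# Placeholder dataset: swap in your own dataset with a "text" column.
dataset = load_dataset("imdb", split="train[:1%]")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=512,
    args=TrainingArguments(
        output_dir="mixtral-finetune",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        logging_steps=10,
    ),
)
trainer.train()
```

After training, only the small LoRA adapter weights need to be saved, which is what makes this approach practical for a model of Mixtral's size.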

LINKS:
Colab notebook (a free T4 GPU will not work): http://tinyurl.com/2hfk2fru
Mistral 7B fine-tune video: https://youtu.be/lCZRwrRvrWg
@AI-Makerspace

Want to Follow:
🦾 Discord: https://discord.com/invite/t4eYQRUcXB
▶️️ Subscribe: https://www.youtube.com/@engineerprompt?sub_confirmation=1

Want to Support:
☕ Buy me a Coffee: https://ko-fi.com/promptengineering
🔴 Support my work on Patreon: Patreon.com/PromptEngineering

Need Help?
📧 Business Contact: engineerprompt@gmail.com
💼Consulting: https://calendly.com/engineerprompt/consulting-call

Join this channel to get access to …


Data Engineer @ Lemon.io | Remote: Europe, LATAM, Canada, UK, Asia, Oceania

Artificial Intelligence – Bioinformatic Expert @ University of Texas Medical Branch | Galveston, TX

Lead Developer (AI) @ Cere Network | San Francisco, US

Research Engineer @ Allora Labs | Remote

Ecosystem Manager @ Allora Labs | Remote

Founding AI Engineer, Agents @ Occam AI | New York