Dec. 21, 2023, 11:45 a.m. | Prompt Engineering

Prompt Engineering www.youtube.com

Can you run Mixtral MoE from Mistral AI on Apple Silicon with MLX? Let's find out.

-------------------------------------------------------------------------------------
Steps to follow:

# Install MLX and the Hugging Face Hub client, then clone mlx-examples
pip install mlx
pip install huggingface_hub hf_transfer
git clone https://github.com/ml-explore/mlx-examples.git
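A quick way to confirm the install steps above worked is to check that each package is importable. This sketch is not part of the video, just a stdlib-only helper:

```python
import importlib.util

def check_installs(packages=("mlx", "huggingface_hub", "hf_transfer")):
    """Return a dict mapping each package name to whether it is importable."""
    return {p: importlib.util.find_spec(p) is not None for p in packages}

# Print a one-line status per package instead of failing on the first miss
for pkg, ok in check_installs().items():
    print(f"{pkg}: {'ok' if ok else 'MISSING'}")
```

If anything reports MISSING, re-run the corresponding pip install before moving on.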

# Download model
export HF_HUB_ENABLE_HF_TRANSFER=1
huggingface-cli download --local-dir Mixtral-8x7B-Instruct-v0.1 mlx-community/Mixtral-8x7B-Instruct-v0.1
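The converted Mixtral-8x7B weights are very large, so it is worth checking free disk space before kicking off the download. A minimal stdlib sketch (the ~90 GB figure is an assumption; check the mlx-community model card for the actual size):

```python
import shutil

def enough_disk_space(needed_gb, path="."):
    """Return True if `path` has at least `needed_gb` GB free."""
    free_gb = shutil.disk_usage(path).free / 1e9
    return free_gb >= needed_gb

# ~90 GB is a rough assumption for the converted 8x7B weights
if not enough_disk_space(90):
    print("Warning: may not have enough disk space for the Mixtral weights")
```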

# Run example
python mlx-examples/mixtral/mixtral.py --model_path Mixtral-8x7B-Instruct-v0.1
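If you want to launch the example from a Python script rather than the shell, the command above can be assembled with the stdlib. This is a hypothetical convenience wrapper, not part of mlx-examples; it assumes the paths used in the steps above:

```python
import sys

def build_run_cmd(model_path="Mixtral-8x7B-Instruct-v0.1",
                  script="mlx-examples/mixtral/mixtral.py"):
    """Assemble the argv for the example run shown above."""
    return [sys.executable, script, "--model_path", model_path]

# Show the command; uncomment the subprocess line to actually run it
# (requires the downloaded weights and an Apple Silicon machine).
print(" ".join(build_run_cmd()))
# import subprocess; subprocess.run(build_run_cmd(), check=True)
```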

-------------------------------------------------------------------------------------
LINKS:
MLX Huggingface: https://huggingface.co/mlx-community
Running MLX on Apple: https://youtu.be/FplJsVd2dTk

Want to Follow:
🦾 Discord: https://discord.com/invite/t4eYQRUcXB
▶️️ Subscribe: https://www.youtube.com/@engineerprompt?sub_confirmation=1

Want to Support:
☕ Buy me a Coffee: https://ko-fi.com/promptengineering …

