Dec. 25, 2023, 10:42 a.m. | /u/MustafaAlahmid

r/MachineLearning (www.reddit.com)

I want to fine-tune Mistral 7B and I have access to 8× A100 40GB. I'm doing a full fine-tune, not LoRA.
Is this possible, or do I need A100 80GB at minimum?

How do I calculate the minimum requirements?
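For a rough sense of the numbers involved, here is a back-of-the-envelope sketch (not from the original post, and the exact breakdown depends on your training stack). It assumes standard mixed-precision AdamW: bf16 weights and gradients, plus fp32 master weights and two fp32 optimizer states per parameter, and it ignores activation memory.

```python
# Rough estimate of full fine-tuning memory for mixed-precision AdamW.
# Assumed breakdown (bytes per parameter): 2 (bf16 weights) + 2 (bf16 grads)
# + 4 (fp32 master weights) + 8 (fp32 exp_avg + exp_avg_sq) = 16 bytes/param.
# Activation memory is excluded; it depends on batch size, sequence length,
# and whether gradient checkpointing is enabled.

def full_finetune_memory_gb(num_params: float) -> dict:
    """Estimate per-component GPU memory in GiB for a full fine-tune with AdamW."""
    gib = 1024 ** 3
    weights = num_params * 2 / gib          # bf16 model weights
    grads = num_params * 2 / gib            # bf16 gradients
    master_weights = num_params * 4 / gib   # fp32 master copy for mixed precision
    adam_states = num_params * 8 / gib      # fp32 first and second moments
    total = weights + grads + master_weights + adam_states
    return {
        "weights_gib": weights,
        "grads_gib": grads,
        "master_weights_gib": master_weights,
        "adam_states_gib": adam_states,
        "total_gib_excl_activations": total,
    }

if __name__ == "__main__":
    # Mistral 7B has roughly 7.24B parameters.
    for name, value in full_finetune_memory_gb(7.24e9).items():
        print(f"{name}: {value:.1f}")
```

Under these assumptions the model state alone comes to roughly 110 GiB, so it cannot sit on any single 40GB (or 80GB) card, but 8× A100 40GB gives about 320 GB in aggregate; with the optimizer states, gradients, and weights sharded across the GPUs (e.g. DeepSpeed ZeRO-3 or PyTorch FSDP), a full fine-tune of a 7B model should fit, leaving the remaining headroom for activations.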

