[D] Training LLM with A100 vs 4x4090?
Jan. 5, 2024, 6:51 a.m. | /u/Electronic_Hawk524
Machine Learning www.reddit.com
I am looking to train a 7B model. It looks like a 7B model will take about 55 GB of GPU memory (using Adam as the optimizer).
So, if I have 4x 4090 GPUs, is that even enough? If I train using DPO or RLHF, which involve two models, will that make the GPU memory requirement 3x?
Which one should I use, an A100 or 4x 4090s?
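For reference, here is the back-of-envelope math I am working from (a rough sketch only; it assumes weights, gradients, and both Adam moments are kept in fp16, and it ignores activations, fp32 master weights, and framework overhead, so real usage will be higher):

```python
# Back-of-envelope GPU memory estimate for full fine-tuning a 7B model.
# Assumption: weights, gradients, and both Adam moment buffers in fp16
# (2 bytes each -> 8 bytes/param). Activations and framework overhead ignored.

PARAMS = 7e9  # 7B parameters

def training_bytes(params: float, bytes_per_param: int = 8) -> float:
    """Weights + grads + Adam m/v, all fp16 under the assumption above."""
    return params * bytes_per_param

def frozen_model_bytes(params: float, bytes_per_param: int = 2) -> float:
    """A frozen reference model (e.g. for DPO) only needs its fp16 weights."""
    return params * bytes_per_param

train_gb = training_bytes(PARAMS) / 1e9
ref_gb = frozen_model_bytes(PARAMS) / 1e9

print(f"trainable 7B model : {train_gb:.0f} GB")           # ~56 GB, close to the 55 GB figure
print(f"+ frozen reference : {train_gb + ref_gb:.0f} GB")   # DPO-style policy + reference
```

Under that estimate, 4x 4090s give 4x24 = 96 GB of VRAM in total but only 24 GB per card, so the model would have to be sharded across GPUs (FSDP/ZeRO-style), whereas an 80 GB A100 would hold the ~56 GB on a single device.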