Oct. 6, 2023, 2:57 a.m. | /u/ColumbiaGSAlum

Machine Learning (www.reddit.com)

Hi. I am a college student trying to run deep learning models (hopefully LLMs one day), and my laptop keeps crashing because of RAM issues. So I am going to build a new desktop. I am thinking of buying two RTX 4090s and parallelizing them instead of buying an A100, because two RTX 4090s cost about half as much as an A100. But is there a downside to parallelizing two GPUs versus buying a single GPU with large VRAM? If …
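For a model that does not fit on one card, "parallelizing them" usually means model (or pipeline) parallelism rather than data parallelism: the weights are split across the two GPUs instead of replicated. Below is a minimal sketch of that idea in plain PyTorch, not the poster's actual setup; the layer sizes and device names are made up for illustration, and it assumes two CUDA devices are visible.

```python
# Minimal sketch (illustrative only) of naive model parallelism in PyTorch:
# the model is split by hand across two GPUs, and activations are moved
# between devices inside forward(). Layer sizes are arbitrary placeholders.
import torch
import torch.nn as nn

class TwoGPUModel(nn.Module):
    def __init__(self):
        super().__init__()
        # First half of the network lives on GPU 0.
        self.part1 = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU()).to("cuda:0")
        # Second half lives on GPU 1, so each card holds only half the weights.
        self.part2 = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU()).to("cuda:1")

    def forward(self, x):
        x = self.part1(x.to("cuda:0"))
        # The activation tensor crosses the PCIe bus here; this transfer is
        # the main overhead compared with a single large-VRAM GPU.
        x = self.part2(x.to("cuda:1"))
        return x

model = TwoGPUModel()
out = model(torch.randn(8, 4096))
print(out.device)  # cuda:1
```

The trade-off the sketch highlights: splitting the model doubles the usable VRAM, but every forward and backward pass now pays for cross-GPU transfers, and only one GPU is busy at a time unless pipeline scheduling is added. Data-parallel training (e.g. DistributedDataParallel) avoids that hop but does not help when a single copy of the model already exceeds one card's memory.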
