Jan. 29, 2024, 11:47 a.m. | /u/Late_Special_6705

Deep Learning www.reddit.com

I'm building an inexpensive starter machine to begin learning ML and came across cheap Tesla M40/P40 24 GB graphics cards.

Question: is it worth buying one of those now, or should I start with something like a 2060 12 GB, 2080 8 GB, or 4060 8 GB instead?

If we compare the speed on benchmark charts, they are 40–84% faster than the M40, but I suspect things will be different for ML workloads. Does anyone have experience with this?
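Beyond raw benchmark speed, one factor worth checking for ML is each card's CUDA compute capability, since tensor cores (which speed up mixed-precision training) only exist on compute capability 7.0 and later. Below is a minimal sketch comparing the cards mentioned above; the values come from NVIDIA's published specs, and the helper function name is my own illustration:

```python
# Rough comparison helper: for ML, CUDA compute capability often matters more
# than raw benchmark speed. Spec values are from NVIDIA's published datasheets.
GPU_SPECS = {
    "Tesla M40": {"vram_gb": 24, "compute_capability": 5.2},  # Maxwell
    "Tesla P40": {"vram_gb": 24, "compute_capability": 6.1},  # Pascal
    "RTX 2060":  {"vram_gb": 12, "compute_capability": 7.5},  # Turing
    "RTX 2080":  {"vram_gb": 8,  "compute_capability": 7.5},  # Turing
    "RTX 4060":  {"vram_gb": 8,  "compute_capability": 8.9},  # Ada
}

def has_tensor_cores(name: str) -> bool:
    """Tensor cores arrived with compute capability 7.0 (Volta) and later;
    the M40 and P40 predate them, so FP16 training gains little there."""
    return GPU_SPECS[name]["compute_capability"] >= 7.0

for name, spec in GPU_SPECS.items():
    cores = "yes" if has_tensor_cores(name) else "no"
    print(f"{name}: {spec['vram_gb']} GB VRAM, tensor cores: {cores}")
```

The trade-off this surfaces: the old Tesla cards win on VRAM (24 GB fits bigger models), while the consumer cards win on architecture support and training speed per step.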

