2x A100 80GB vs 3x RTX 6000 Ada 48GB GPUs for LLM/ViT inference and training?
April 16, 2023, 10:41 a.m. | /u/lolman2215
Deep Learning www.reddit.com
How do the latest A100 80GB GPUs compare with the new RTX 6000 Ada 48GB when
a) Training LLMs?
b) Performing inference with LLMs?
The 2x A100 setup provides 160 GB of VRAM; the 3x RTX 6000 Ada setup provides 144 GB. But inter-GPU data transfer is likely more of a bottleneck with three cards.
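The capacity comparison above can be sketched with a back-of-the-envelope calculation. This is not from the post — the 20% activation/KV-cache margin and the 65B fp16 example model are illustrative assumptions, and real memory use depends on sequence length, batch size, and the parallelism framework:

```python
# Rough weights-only VRAM estimate for comparing multi-GPU setups.
# Assumes fp16 weights (2 bytes/param) and an illustrative ~20% margin
# for activations / KV cache; both numbers are assumptions, not measurements.

def weights_vram_gb(n_params_billion: float, bytes_per_param: int = 2) -> float:
    """VRAM in GiB needed just for the model weights."""
    return n_params_billion * 1e9 * bytes_per_param / 1024**3

def fits(setup_vram_gb: float, n_params_billion: float,
         overhead_frac: float = 0.2) -> bool:
    """Does the setup hold the weights plus the assumed overhead margin?"""
    return weights_vram_gb(n_params_billion) * (1 + overhead_frac) <= setup_vram_gb

# 2x A100 80GB = 160 GB total, 3x RTX 6000 Ada 48GB = 144 GB total
for name, vram in [("2x A100 80GB", 160), ("3x RTX 6000 Ada 48GB", 144)]:
    print(f"{name}: fits a 65B fp16 model -> {fits(vram, 65)}")
```

Under these assumptions a 65B fp16 model (~121 GiB of weights) squeezes into 160 GB but not 144 GB — and this counts capacity only; it says nothing about the inter-GPU bandwidth concern, where NVLink-connected A100s have a clear edge over PCIe-attached cards.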