Huge-VRAM multi-GPU setup | need advice
Sept. 26, 2023, 3:55 p.m. | /u/Im_theBob
Deep Learning www.reddit.com
First of all, this is my very first post on Reddit; my apologies if it has flaws.
I would like to build a CUDA setup, and I know that VRAM is very important for my applications. I've found that the Nvidia Tesla K80 24 GB could be interesting, but it is very slow. My question is: can I use a 4090 as the computation unit and the Teslas basically as VRAM tanks?
If yes, how may I? …
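What the question describes is closest to manual model parallelism: there is no transparent way to pool another card's VRAM behind a single GPU, but you can place memory-heavy layers on the large-VRAM card and compute-heavy layers on the fast card, moving activations between them. Below is a minimal, hedged PyTorch sketch of that idea; the layer sizes and the fast/big device roles are illustrative assumptions, not a tested 4090 + K80 recipe (mixing very different GPU generations may also require matching CUDA and driver versions). It falls back to CPU when fewer than two CUDA devices are present.

```python
import torch
import torch.nn as nn

# Assumed device roles: "cuda:0" = fast compute card (e.g. a 4090),
# "cuda:1" = large-VRAM card (e.g. a Tesla K80). Fall back gracefully
# so the sketch still runs on CPU-only machines.
dev_fast = torch.device("cuda:0") if torch.cuda.device_count() >= 1 else torch.device("cpu")
dev_big = torch.device("cuda:1") if torch.cuda.device_count() >= 2 else dev_fast

class SplitModel(nn.Module):
    """Toy two-stage model split across two devices."""
    def __init__(self):
        super().__init__()
        # Compute-heavy front end lives on the fast card.
        self.front = nn.Linear(64, 256).to(dev_fast)
        # Memory-heavy back end's parameters live on the big-VRAM card.
        self.back = nn.Linear(256, 10).to(dev_big)

    def forward(self, x):
        x = torch.relu(self.front(x.to(dev_fast)))
        # Activations are explicitly copied between devices at the split.
        return self.back(x.to(dev_big))

model = SplitModel()
out = model(torch.randn(8, 64))
print(out.shape)  # torch.Size([8, 10])
```

The cost of this scheme is the device-to-device copy at every split point, which is why a slow card used as a "VRAM tank" can bottleneck the fast one; frameworks like Hugging Face Accelerate automate this kind of placement with a `device_map`, but the trade-off is the same.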
More from www.reddit.com / Deep Learning
A Visual Guide to GNN Sampling using PyTorch Geometric (2 days, 4 hours ago | www.reddit.com)
How can a transformer be equivariant? (3 days, 3 hours ago | www.reddit.com)
4060 Ti 16 GB or 4070 Super 12 GB? (3 days, 9 hours ago | www.reddit.com)
Is it possible to do "surgery" on a trained dataset for generative AI? (3 days, 12 hours ago | www.reddit.com)
Thoughts on New Transformer Stacking Paper (3 days, 23 hours ago | www.reddit.com)
Jobs in AI, ML, Big Data
Senior Machine Learning Engineer
@ GPTZero | Toronto, Canada
ML/AI Engineer / NLP Expert - Custom LLM Development (x/f/m)
@ HelloBetter | Remote
Doctoral Researcher (m/f/div) in Automated Processing of Bioimages
@ Leibniz Institute for Natural Product Research and Infection Biology (Leibniz-HKI) | Jena
Seeking Developers and Engineers for AI T-Shirt Generator Project
@ Chevon Hicks | Remote
Senior Applied Data Scientist
@ dunnhumby | London
Principal Data Architect - Azure & Big Data
@ MGM Resorts International | Home Office - US, NV