Training on GTX 1650 Ti is faster than RTX 3060 Ti
Jan. 19, 2024, 9:53 a.m. | /u/blooming17
Deep Learning www.reddit.com
Hello, I am training a UNet with atrous (dilated) 2D convolutions. While experimenting with various GPUs, I noticed that the RTX 3060 Ti takes far longer to train than the GTX 1650 Ti, which normally shouldn't be the case.
Has anyone encountered the same problem? What do you think the cause might be?
PS: The model has around 2.3 million parameters.
- The dataset has around 200k training examples …
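A quick sanity-check sketch for a slowdown like this, assuming the training loop is in PyTorch (the post doesn't name the framework, and the model/tensor shapes below are illustrative, not the poster's actual UNet). Two common culprits worth ruling out: the job silently running on the wrong device, and cuDNN autotuning being left off for dilated convolutions, which can fall back to slow kernels on some GPUs.

```python
import time
import torch
import torch.nn as nn

# Confirm which device is actually being used.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
if device.type == "cuda":
    print(torch.cuda.get_device_name(device))
    # Let cuDNN benchmark and pick the fastest kernel for these shapes.
    torch.backends.cudnn.benchmark = True

# Minimal stand-in for one atrous (dilated) conv block, not the real UNet.
block = nn.Conv2d(16, 16, kernel_size=3, padding=2, dilation=2).to(device)
x = torch.randn(8, 16, 64, 64, device=device)

# Warm up (kernel selection / lazy init), then time forward+backward passes.
for _ in range(3):
    block(x).sum().backward()
if device.type == "cuda":
    torch.cuda.synchronize()

start = time.perf_counter()
for _ in range(10):
    block(x).sum().backward()
if device.type == "cuda":
    torch.cuda.synchronize()
elapsed = time.perf_counter() - start
print(f"10 iterations: {elapsed:.3f}s")
```

Comparing this number across the two cards (and with `benchmark` toggled on and off) should show whether the gap comes from the dilated-conv kernels themselves or from something outside the model, such as data loading.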