Jan. 19, 2024, 9:53 a.m. | /u/blooming17

Deep Learning www.reddit.com

It's a GTX 1650 Ti, not a 1060 Ti, sorry

Hello, I am training a UNet with atrous (dilated) 2D convolutions. I have been experimenting with various GPUs and noticed that the RTX 3060 Ti takes much more time to train than the GTX 1650 Ti, which normally shouldn't be the case.

Has anyone encountered the same problem? What do you think might be causing it?

PS:
- The model has around 2.3 million parameters.
- The dataset has around 200k training examples …
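In case it helps to pin down where the slowdown is: below is a minimal timing sketch for a single dilated conv block. PyTorch, the channel count, input size, and batch size are all assumptions here, since the post doesn't say which framework or layer shapes are used. It wraps the timed loop in torch.cuda.synchronize() calls, because unsynchronized wall-clock timings of asynchronous CUDA kernels can easily make one GPU look slower than it really is, and it enables cuDNN autotuning, which can matter a lot for dilated convolutions.

```python
import time
import torch
import torch.nn as nn

# Hypothetical stand-in for one atrous (dilated) conv block of the UNet;
# the real architecture and framework are not given in the post.
block = nn.Sequential(
    nn.Conv2d(64, 64, kernel_size=3, padding=2, dilation=2),
    nn.BatchNorm2d(64),
    nn.ReLU(inplace=True),
).cuda()

# Assumed input shape: batch of 8, 64 channels, 256x256 feature maps.
x = torch.randn(8, 64, 256, 256, device="cuda", requires_grad=True)

# Let cuDNN autotune and pick the fastest conv algorithm for this GPU;
# leaving this off can hit slow fallback kernels for dilated convs.
torch.backends.cudnn.benchmark = True

# Warm-up iterations so lazy CUDA init and autotuning don't skew the timing.
for _ in range(10):
    block(x).sum().backward()

torch.cuda.synchronize()
start = time.perf_counter()
for _ in range(100):
    block(x).sum().backward()
torch.cuda.synchronize()  # wait for all queued kernels before reading the clock
print(f"{(time.perf_counter() - start) / 100 * 1000:.2f} ms per iteration")
```

Running the same sketch on both cards (and checking nvidia-smi during real training to confirm the 3060 Ti is actually the device being used and isn't starved by data loading) should show whether the GPU compute itself is the bottleneck.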

