Jan. 19, 2024, 9:53 a.m. | /u/blooming17

Deep Learning www.reddit.com

It's a GTX 1650 Ti, not a 1060 Ti, sorry.

Hello, I am training a UNet with atrous (dilated) 2D convolutions and experimenting with various GPUs. I noticed that the RTX 3060 Ti takes much more time to train than the GTX 1650 Ti, which normally shouldn't be the case.

Has anyone encountered the same problem? What do you think the cause might be?

PS:
- The model has around 2.3 million parameters.
- The dataset has around 200k training examples …
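
For anyone who wants to reproduce the comparison, below is a minimal PyTorch timing sketch, assuming a CUDA build of PyTorch; the layer sizes, dilation rates, and batch shape are placeholders and not the actual UNet. Running it on each card, with `torch.backends.cudnn.benchmark` toggled on and off, can help tell whether the slowdown comes from cuDNN kernel selection for the atrous convolutions or from something outside the model (e.g. the data pipeline).

```python
# Minimal per-GPU training-step benchmark (sketch; placeholder model, not the real UNet).
import time
import torch
import torch.nn as nn

torch.backends.cudnn.benchmark = True  # let cuDNN autotune kernels for the dilated convs

device = torch.device("cuda")

# Stand-in block with atrous (dilated) 2D convolutions.
model = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=2, dilation=2),
    nn.ReLU(inplace=True),
    nn.Conv2d(32, 32, kernel_size=3, padding=4, dilation=4),
    nn.ReLU(inplace=True),
    nn.Conv2d(32, 1, kernel_size=1),
).to(device)

opt = torch.optim.Adam(model.parameters())
x = torch.randn(16, 3, 256, 256, device=device)  # placeholder batch
y = torch.randn(16, 1, 256, 256, device=device)

# Warm-up so cuDNN autotuning and CUDA context init don't skew the timing.
for _ in range(10):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
torch.cuda.synchronize()

start = time.perf_counter()
for _ in range(50):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
torch.cuda.synchronize()

elapsed_ms = (time.perf_counter() - start) / 50 * 1000
print(f"{torch.cuda.get_device_name(device)}: {elapsed_ms:.1f} ms/step")
```

If the 3060 Ti is still slower on this synthetic model alone, the issue is likely in kernel selection for the dilated convolutions; if it is only slower on the full training loop, look at data loading and preprocessing instead.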
