July 23, 2022, 1:38 p.m. | /u/MrAcurite | Machine Learning (www.reddit.com)

I just built myself a new machine with an RTX 3090, and have been training some models.

When I place no limits on the GPU, it consumes ~350 watts, averages ~80% utilization, and completes an epoch for the model and dataset I'm using in 50 seconds.

When I limit the GPU to 1500 MHz, it consumes ~220 watts, averages ~90% utilization, and completes an epoch for the same model and the same dataset in 54 seconds. So I save more …
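A rough energy-per-epoch comparison from those numbers, as a sketch only: it assumes the reported draw is roughly constant over each epoch, and the post doesn't say how the 1500 MHz cap was applied (on Linux, locking clocks with `nvidia-smi -lgc` is one common way to do it).

```python
# Back-of-the-envelope energy-per-epoch comparison using only the figures
# quoted above, assuming the reported power draw is roughly constant.
# (The 1500 MHz cap itself could be set with e.g. `nvidia-smi -lgc 0,1500`,
# though the post doesn't state which method was used.)

def energy_kj(watts: float, seconds: float) -> float:
    """Energy in kilojoules for a constant power draw over a duration."""
    return watts * seconds / 1000.0

unlimited = energy_kj(350, 50)  # no limits: ~350 W for 50 s per epoch
limited = energy_kj(220, 54)    # capped at 1500 MHz: ~220 W for 54 s per epoch

print(f"Unlimited: {unlimited:.1f} kJ/epoch")  # ~17.5 kJ
print(f"Limited:   {limited:.1f} kJ/epoch")    # ~11.9 kJ
print(f"Energy saved: {100 * (1 - limited / unlimited):.0f}%")  # ~32%
print(f"Slowdown:     {100 * (54 / 50 - 1):.0f}%")              # ~8%
```

So on these numbers the capped run trades roughly an 8% longer epoch for about a third less energy per epoch.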

Tags: clock speeds, gpu, machinelearning, power, power consumption
