Feb. 17, 2024, 12:04 a.m. | /u/lildaemon

Machine Learning | www.reddit.com

Renting a dedicated server with GPU support can be expensive, especially when the model has billions of parameters. According to my calculations, using something like AWS, it comes out to about $20k per year -- that's assuming $2 to $3 per hour for the server. I have some models that I am training that I would like to use in web apps. If the web apps are successful, then that $20k is well spent, but if they are not, then …
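For context, a quick back-of-the-envelope sketch of where that ~$20k/year figure comes from, assuming the server is kept running around the clock at the quoted $2 to $3 per hour rate:

```python
# Rough annual cost of a 24/7 GPU server at the hourly rates quoted above.
# Assumption: the instance is never shut down (8,760 hours per year).
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

for hourly_rate in (2.0, 3.0):
    annual_cost = hourly_rate * HOURS_PER_YEAR
    print(f"${hourly_rate:.2f}/hour -> ${annual_cost:,.0f}/year")

# $2.00/hour -> $17,520/year
# $3.00/hour -> $26,280/year  (i.e. roughly the $20k/year ballpark)
```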

