Aug. 21, 2022, 9 p.m. | /u/paulcjh

Machine Learning www.reddit.com

I wrote a post on how to virtualise GPUs and attach them to VMs for on-prem workloads. The VMs can then be attached to Juju & K8s for load balancing or whatever else you need. I implemented this where I work and it now runs all of our ML compute. It was a pain to get working originally, so I hope you find it useful (a rough sketch of the vGPU creation step is included below the link):

[https://www.paulcjh.com/technical\_posts/gpu\_virtualisation.html](https://www.paulcjh.com/technical_posts/gpu_virtualisation.html)
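For context, the usual route on an NVIDIA card is to carve the physical GPU into mediated devices (mdev) that VFIO can hand to a VM. The sketch below is not taken from the linked post; it's a minimal illustration of that one step using the standard Linux mdev sysfs interface, assuming the NVIDIA vGPU driver is already installed, and using a hypothetical PCI address and profile ID:

```python
#!/usr/bin/env python3
"""Minimal sketch (not from the linked post): create an NVIDIA vGPU
mediated device (mdev) that a VM can then be given via VFIO.
Assumes the nvidia vGPU driver is loaded and the standard Linux mdev
sysfs layout; the PCI address and profile ID below are hypothetical."""

import uuid
from pathlib import Path

PCI_ADDR = "0000:3b:00.0"  # hypothetical physical GPU PCI address
MDEV_ROOT = Path("/sys/class/mdev_bus") / PCI_ADDR / "mdev_supported_types"


def list_vgpu_types() -> dict[str, str]:
    """Return {type_id: human-readable name} for the vGPU profiles the card offers."""
    types = {}
    for type_dir in MDEV_ROOT.iterdir():
        name_file = type_dir / "name"
        types[type_dir.name] = (
            name_file.read_text().strip() if name_file.exists() else type_dir.name
        )
    return types


def create_vgpu(type_id: str) -> str:
    """Create one mediated device of the given profile and return its UUID.

    Writing a UUID to the profile's `create` file is how the kernel's mdev
    framework instantiates the device; the UUID is what you later reference
    from the VM definition.
    """
    dev_uuid = str(uuid.uuid4())
    (MDEV_ROOT / type_id / "create").write_text(dev_uuid)
    return dev_uuid


if __name__ == "__main__":
    for type_id, name in list_vgpu_types().items():
        print(type_id, name)
    # e.g. create_vgpu("nvidia-63")  # profile IDs are card-specific
```

Once created, the device UUID goes into the VM's libvirt definition as a `<hostdev mode='subsystem' type='mdev'>` entry, and the guest then sees an ordinary (if smaller) GPU that Juju/K8s workloads can schedule against.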

compute gpu k8s machinelearning ml on-prem
