May 13, 2023, 4:01 p.m. | /u/gabrielesilinic

Deep Learning www.reddit.com

First of all, why? Well, look at the price of an A100 GPU and you will understand. The big advantage of running large models on an integrated graphics card is, first of all, that they should be able to run there at all.

Why? Well, I just upgraded my laptop and it now has 32 GB of RAM. The integrated GPU can share those 32 GB of system memory with ease and use them as its VRAM, so even if it will not run as …
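The point about shared memory can be made concrete with some back-of-the-envelope arithmetic (this is an illustrative sketch, not from the original post): the weights of a 7B-parameter model in fp16 need roughly 13 GB, which would not fit in a typical discrete laptop GPU's 8 GB of VRAM but fits comfortably in 32 GB of system RAM shared with an integrated GPU like Iris Xe.

```python
def model_memory_gb(num_params: int, bytes_per_param: int = 4) -> float:
    """Rough RAM needed just for the weights (ignores activations,
    KV cache, and optimizer state, which add substantially more)."""
    return num_params * bytes_per_param / 1024**3

# Hypothetical 7B-parameter model:
fp16_gb = model_memory_gb(7_000_000_000, bytes_per_param=2)  # ~13 GB
fp32_gb = model_memory_gb(7_000_000_000, bytes_per_param=4)  # ~26 GB
print(f"fp16: {fp16_gb:.1f} GB, fp32: {fp32_gb:.1f} GB")
```

Even at fp32 the weights alone (~26 GB) squeeze into 32 GB of shared memory, whereas at runtime activations and caches would make lower precision the realistic choice.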

