April 28, 2023, 1:11 a.m. | /u/InfinitePerplexity99

r/MachineLearning (www.reddit.com)

Let's assume the following:

- I'm willing to buy an NVIDIA RTX 3090 or a similar card.

- I already have an NVIDIA GTX 1650 that currently works just fine for gaming and graphics.

- I'm willing to buy RAM as necessary.

- I don't know much about hardware.

- I do know a fair amount about language models.

Is it feasible to kick off a language-modeling job on the RTX 3090 - something like running inference with LLaMA or …
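For a sense of what that inference job could look like, here is a minimal sketch using the Hugging Face transformers library, assuming a 7B LLaMA-style checkpoint loaded in fp16 (roughly 14 GB of weights, which fits in the 3090's 24 GB of VRAM). The model id is a placeholder for whatever checkpoint you actually have access to, and `device_map="auto"` requires the accelerate package.

```python
# Minimal sketch: LLaMA-style 7B inference on a 24 GB RTX 3090.
# The model id below is a placeholder, not a real repository name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path-or-hub-id-of-your-llama-checkpoint"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16: ~14 GB for a 7B model, fits in 24 GB
    device_map="auto",          # place the layers on the GPU (needs accelerate)
)

prompt = "The RTX 3090 has 24 GB of VRAM, which means"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Larger checkpoints (13B and up) generally need quantization (e.g. 8-bit or 4-bit via bitsandbytes) or CPU offloading to fit in 24 GB.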

