Oct. 5, 2023, 2:04 p.m. | /u/EatTFM

Machine Learning www.reddit.com

Hi,

our company is planning some budget in 2024 to invest in hardware for the following

- running local LLMs so our coworkers can interact with a locally hosted, offline GPT similar to ChatGPT. Use cases:

* Generating templates for emails, letters, etc.
* Translation (EN/GER/FR/SPA)
* Querying internal knowledge bases and/or FAQs/HOWTOs (a rough sketch of this follows below)
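A minimal sketch of the knowledge-base/FAQ use case, purely to illustrate what "querying internal docs through a local LLM" could look like. It assumes a locally running OpenAI-compatible server (e.g. llama.cpp's server or Ollama); the base URL, port, model name "llama3", and the helper `answer_from_faq` are placeholders, not anything from the original post.

```python
# Sketch: answering a question from internal FAQ excerpts via a local LLM.
# Assumes an OpenAI-compatible server on localhost (e.g. Ollama or llama.cpp).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="not-needed")

def answer_from_faq(question: str, faq_snippets: list[str]) -> str:
    """Stuff the relevant FAQ excerpts into the prompt and ask the local model."""
    context = "\n\n".join(faq_snippets)
    response = client.chat.completions.create(
        model="llama3",  # placeholder; any locally served chat model works
        messages=[
            {"role": "system",
             "content": "Answer strictly from the provided internal FAQ excerpts."},
            {"role": "user",
             "content": f"FAQ excerpts:\n{context}\n\nQuestion: {question}"},
        ],
        temperature=0.2,
    )
    return response.choices[0].message.content

print(answer_from_faq("How do I request a new laptop?",
                      ["Hardware requests go through the IT service portal."]))
```

In practice the FAQ excerpts would come from a retrieval step (embedding search over the internal docs) rather than being passed in by hand, but the hardware question stays the same: the model serving the chat endpoint is what drives the GPU/RAM budget.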

I did some research, but it is still hard for me to estimate what the HW / AI skill requirements are to implement something not a quarter …

