June 23, 2023, 2:09 p.m. | /u/phas0ruk1

Data Science www.reddit.com

I want to use a chat LLM on my website. I’m a full stack dev. I’m confused about the AI stack.

From my front end, where do I send the API request to the LLM? Can I host the model on Hugging Face and call it via an API, or do I need to host it elsewhere (presumably I do) with a GPU cloud provider like vast.ai?
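A common pattern here is to keep the model behind your own backend route rather than calling it from the browser, so the API token stays server-side. Below is a minimal sketch assuming the model is hosted on Hugging Face's Inference API; the model id, generation parameters, and response shape are illustrative assumptions, not a definitive setup.

```typescript
// Minimal backend proxy sketch (Node 18+, global fetch available).
// The browser calls your own route (e.g. POST /api/chat); that route calls this
// function, so the Hugging Face token is never exposed to the front end.

const HF_TOKEN = process.env.HF_API_TOKEN;           // set in the backend environment
const MODEL_ID = "tiiuae/falcon-7b-instruct";        // assumption: any hosted text-generation model

export async function queryModel(prompt: string): Promise<string> {
  const res = await fetch(
    `https://api-inference.huggingface.co/models/${MODEL_ID}`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${HF_TOKEN}`,
        "Content-Type": "application/json",
      },
      // max_new_tokens is an illustrative generation parameter
      body: JSON.stringify({ inputs: prompt, parameters: { max_new_tokens: 256 } }),
    },
  );
  if (!res.ok) throw new Error(`Inference API error: ${res.status}`);

  // Text-generation endpoints typically return [{ generated_text: "..." }]
  const data = (await res.json()) as Array<{ generated_text: string }>;
  return data[0].generated_text;
}
```

If you instead rent a GPU from a provider like vast.ai and run the model yourself, the shape is the same: your backend route forwards the prompt to whatever HTTP endpoint your inference server exposes, and the front end only ever talks to your backend.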

