June 27, 2023, 7:48 p.m. | Aitrepreneur

Aitrepreneur www.youtube.com

ExLLAMA is a real breakthrough in the LLM community! This innovative update for the text-generation-webui not only increases the TOKEN capacity of a LLaMA model to 8K+, it also significantly REDUCES VRAM usage and gives a MAJOR SPEED BOOST to text generation. In this video, I'll show you how to use it.
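For reference, here is a minimal sketch of how you might launch the webui with the ExLlama loader and an extended 8K context from Python. It assumes your install exposes the --loader, --max_seq_len and --compress_pos_emb flags (check `python server.py --help` for your version); the model folder name is just a placeholder.

```python
# Minimal sketch: launching text-generation-webui with the ExLlama loader.
# Flag names (--loader, --max_seq_len, --compress_pos_emb) are assumptions based
# on recent webui versions -- verify them against `python server.py --help`.
import subprocess

MODEL_NAME = "TheBloke_WizardLM-7B-GPTQ"  # placeholder: any 4-bit GPTQ LLaMA model folder

subprocess.run(
    [
        "python", "server.py",
        "--model", MODEL_NAME,
        "--loader", "exllama",        # use the ExLlama backend instead of GPTQ-for-LLaMa
        "--max_seq_len", "8192",      # extend the context window to 8K tokens
        "--compress_pos_emb", "4",    # 2048 * 4 = 8192, for SuperHOT-style 8K models
    ],
    check=True,  # raise if the webui fails to start
)
```

The same settings can also be applied from the webui's Model tab instead of the command line.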

What do you think of ExLLAMA? Let me know in the comments!
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
SOCIAL MEDIA LINKS!
✨ Support my work on Patreon: https://www.patreon.com/aitrepreneur …
