Feb. 22, 2024, 1:35 p.m. | /u/WritingBeginning3403

Machine Learning www.reddit.com

Hello everyone. We have all seen the Gemini 1.5 model with its 1 million token context window, and the hardware from a company called Groq has shown that language models can run much faster when the hardware is designed specifically for them. What do you think about RAG architectures now that we have seen very long context models? What if we get even longer context models with better quantization techniques and hardware? Do you think architectures like RAG …

