April 23, 2023, 5:35 p.m. | /u/Mbando

Natural Language Processing www.reddit.com

To the best of my understanding, if I want to use Alpaca or Vicuna variants, I need the original LLaMA weights, and for whatever reason FB is blocking access for anyone from my institution. Given that, what's the best alternative in the 7-13B range for instruct-trained models?
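
For context, the practical difference is that openly licensed instruct models can be pulled straight from the Hugging Face Hub with no gated-weight or access-request step. A minimal sketch, assuming the `transformers` library (plus `accelerate` for `device_map="auto"`) and using `databricks/dolly-v2-7b` purely as an illustrative example of a 7B instruct-trained model with no LLaMA dependency, not as a recommendation:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Example repo ID of an openly licensed 7B instruct model; any
# non-gated instruct model in the 7-13B range works the same way.
model_id = "databricks/dolly-v2-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit a single GPU
    device_map="auto",          # requires the `accelerate` package
)

prompt = "Explain instruction tuning in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```

No weight conversion or separate download step is involved; the blocked-access problem only arises for models whose base weights are distributed under a gated or request-only license.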
