April 23, 2023, 5:35 p.m. | /u/Mbando

To the best of my understanding, if I want to use Alpaca or Vicuna variants, I need the original LLaMA weights, and for whatever reason FB is blocking anyone at my institution from getting access. Given that, what's the best alternative in the 7-13B range for instruction-tuned models?
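
For context, the kind of thing I'm hoping to do is just load a checkpoint through the standard Hugging Face `transformers` pipeline. A minimal sketch of what I mean, using `databricks/dolly-v2-12b` purely as a placeholder candidate (it's MIT-licensed and Pythia-based, so no LLaMA weights involved), not an endorsement:

```python
# Minimal sketch: load an openly licensed instruction-tuned model with
# transformers. databricks/dolly-v2-12b is a placeholder candidate here,
# not necessarily the best pick for this range.
import torch
from transformers import pipeline

generate = pipeline(
    model="databricks/dolly-v2-12b",
    torch_dtype=torch.bfloat16,  # halves memory vs. fp32; needs a recent GPU
    device_map="auto",           # shard across available devices (requires accelerate)
    trust_remote_code=True,      # Dolly ships its own instruction-following pipeline class
)

result = generate("Summarize instruction tuning in two sentences.")
print(result[0]["generated_text"])
```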
