May 22, 2023, 11:09 a.m. | /u/capital-man

Machine Learning www.reddit.com

Zicklein is a German version of Alpaca 7B, fine-tuned with the LoRA method on a German translation of the cleaned Alpaca instruction dataset.

Github: [https://github.com/avocardio/zicklein](https://github.com/avocardio/zicklein)

HuggingFace: [https://huggingface.co/avocardio/alpaca-lora-7b-german-base-52k](https://huggingface.co/avocardio/alpaca-lora-7b-german-base-52k)

You can also try it out [here](https://huggingface.co/spaces/avocardio/German-Alpaca-LoRA-7b) (although it's quite slow: it runs on a CPU, so responses take around 130 s).
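If the demo is too slow, here is a minimal sketch of how the adapter could be loaded locally with `transformers` and `peft`. The base checkpoint (`decapoda-research/llama-7b-hf`) and the Alpaca-style prompt template are assumptions on my part, not something confirmed in the post, so adjust them to whatever the repo's README specifies.

```python
# Sketch: attach the Zicklein LoRA adapter to a LLaMA-7B base model.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base_model_id = "decapoda-research/llama-7b-hf"  # assumption: LLaMA-7B base weights
adapter_id = "avocardio/alpaca-lora-7b-german-base-52k"

tokenizer = LlamaTokenizer.from_pretrained(base_model_id)
model = LlamaForCausalLM.from_pretrained(
    base_model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)
# Load the German instruction-tuned LoRA weights on top of the frozen base model.
model = PeftModel.from_pretrained(model, adapter_id)
model.eval()

# Alpaca-style prompt with a German instruction (template is an assumption).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nErkläre kurz, was ein LoRA-Adapter ist.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```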

