Dec. 19, 2023, 8:20 a.m. | Maxim Saplin

DEV Community dev.to

Phi-2, an open-source model from Microsoft, promises to match or even beat Llama-2 70B, all while being tiny (just 2.7B parameters). The secret sauce is a small quantity of high-quality 'textbook' synthetic training data.


What interests me is that the model is relatively easy to run locally on almost any consumer hardware. You can grab it and talk to it through LM Studio - I downloaded and used the 'TheBloke/phi-2-GGUF/phi-2.Q4_K_M.gguf' version of the model. …
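If you prefer code over a GUI, the same Q4_K_M GGUF file can be loaded with the llama-cpp-python library instead of LM Studio. This is just a sketch of that route, not what the author used: the local file name `phi-2.Q4_K_M.gguf` and the example instruction are assumptions, and the `Instruct:/Output:` prompt shape follows Phi-2's documented QA format.

```python
# Sketch: running TheBloke's phi-2 Q4_K_M GGUF locally with llama-cpp-python
# (pip install llama-cpp-python). Assumes the .gguf file has already been
# downloaded from the TheBloke/phi-2-GGUF repo on Hugging Face.

def build_prompt(instruction: str) -> str:
    """Wrap a question in Phi-2's instruct format: 'Instruct: ...\nOutput:'."""
    return f"Instruct: {instruction}\nOutput:"

def ask(model_path: str, instruction: str, max_tokens: int = 256) -> str:
    # Import inside the function so build_prompt stays usable without the library.
    from llama_cpp import Llama

    llm = Llama(model_path=model_path, n_ctx=2048, verbose=False)
    result = llm(
        build_prompt(instruction),
        max_tokens=max_tokens,
        stop=["Instruct:"],  # stop before the model invents a follow-up question
    )
    return result["choices"][0]["text"].strip()

if __name__ == "__main__":
    # Hypothetical local path; adjust to wherever LM Studio (or you) saved the file.
    print(ask("phi-2.Q4_K_M.gguf", "Explain model quantization in one sentence."))
```

On a CPU-only laptop the Q4_K_M quant fits comfortably in RAM (the file is roughly 1.8 GB), which is what makes a 2.7B model like this practical on consumer hardware.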
