Dec. 19, 2023, 2 p.m. | Anthony Alford

InfoQ - AI, ML & Data Engineering www.infoq.com

Microsoft Research announced Phi-2, a 2.7-billion-parameter Transformer-based language model. Phi-2 was trained on 1.4T tokens of synthetic data generated by GPT-3.5, along with filtered web data, and outperforms much larger models on a variety of benchmarks.
