Microsoft Announces Small Language Model Phi-2
InfoQ - AI, ML & Data Engineering www.infoq.com
Microsoft Research announced Phi-2, a 2.7 billion-parameter Transformer-based language model. Phi-2 was trained on 1.4T tokens drawn from a mix of synthetic data generated by GPT-3.5 and filtered web data, and it outperforms larger models on a variety of benchmarks.
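The 2.7 billion-parameter figure can be roughly reproduced with a back-of-the-envelope count for a decoder-only Transformer. The dimensions below (hidden size, layer count, MLP width, vocabulary size) are assumptions taken from the publicly released microsoft/phi-2 checkpoint, not figures stated in this article, so treat this as an illustrative sketch:

```python
# Back-of-the-envelope parameter count for a Phi-2-style decoder-only
# Transformer. All dimensions are assumptions based on the published
# microsoft/phi-2 configuration, not details from the article above.
d_model = 2560      # hidden size (assumed)
n_layers = 32       # number of decoder blocks (assumed)
d_ffn = 10240       # MLP intermediate size, 4 * d_model (assumed)
vocab = 51200       # tokenizer vocabulary size (assumed)

embedding = vocab * d_model        # token embedding table
attention = 4 * d_model * d_model  # Q, K, V, and output projections
mlp = 2 * d_model * d_ffn          # up- and down-projections
per_layer = attention + mlp
lm_head = vocab * d_model          # untied output projection (assumed untied)

total = embedding + n_layers * per_layer + lm_head
print(f"~{total / 1e9:.2f}B parameters")  # lands near the reported 2.7B
```

Bias terms and layer norms are omitted; they contribute well under one percent of the total, which is why the estimate still lands close to the headline 2.7B figure.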
By Anthony Alford