Sept. 12, 2023, 5:17 a.m. | /u/PantsuWitch

Machine Learning www.reddit.com

[Arxiv link: Textbooks are all you need II](https://arxiv.org/abs/2309.05463)

> More generally, **phi-1.5** (1.3B) exhibits many of the traits of much larger LLMs, both good – such as the ability to "think step by step" or perform some rudimentary in-context learning – and bad, including hallucinations and the potential for toxic and biased generations – encouragingly though, we are seeing improvement on that front thanks to the absence of web data. We open-source **phi-1.5** to promote further research on these urgent …
