March 15, 2022, 5:17 p.m. | /u/cavedave

Machine Learning www.reddit.com

Come ask questions about anything: the new HuggingFace BigScience [language model](https://bigscience.huggingface.co/blog/what-language-model-to-train-if-you-have-two-million-gpu-hours), the [dataset](https://bigscience.huggingface.co/blog/building-a-tb-scale-multilingual-dataset-for-language-modeling), the licences, the [cluster](https://bigscience.huggingface.co/blog/which-hardware-to-train-a-176b-parameters-model)!


[BigScience](https://bigscience.huggingface.co/) has just started training a massive-scale multilingual language model on the French supercomputer Jean Zay – literally out in the open. This is not only the first time a multilingual LLM (46 languages!) at this scale will be fully accessible to the ML research community; the whole decision, engineering and training process is transparent and open as well. We'll be …

announcement bigscience huggingface machinelearning
