GLM-130B: An Open Bilingual Pre-Trained Model
Unite.AI (www.unite.ai)
GLM-130B is a bilingual pre-trained large language model with over 130 billion parameters, capable of generating text in both English and Chinese. The framework is an attempt to open-source a language model at a scale of over 100B parameters and to discuss how models of such a large scale can be […]