Nov. 7, 2023, 4:37 p.m. | Kunal Kejriwal

Unite.AI www.unite.ai

GLM-130B is a bilingual pre-trained large language model with 130 billion parameters, capable of generating text in both English and Chinese. The framework is an attempt to open-source a language model at a scale of over 100B parameters, and to discuss how frameworks of such a large scale can be […]


The post GLM-130B: An Open Bilingual Pre-Trained Model appeared first on Unite.AI.

