July 28, 2023, 3:22 p.m. | Craig S. Smith

IEEE Spectrum spectrum.ieee.org



Who said all large language models (LLMs) necessarily need to be large? In China, LLMs are currently shrinking in parameter count. According to sources, this is because the country is now focused on enabling Chinese startups and smaller organizations to build their own generative AI applications. As part of this downscaling trend, in June the Beijing Academy of Artificial Intelligence (BAAI) introduced Wu Dao 3.0, a series of open-source LLMs.

Based on interviews with high-ranking, anonymous …

