Nov. 12, 2023, 3 p.m. | Samuel K. Moore

IEEE Spectrum spectrum.ieee.org



The leading public apples-to-apples test of computer systems’ ability to train machine-learning neural networks has fully entered the generative AI era. Earlier this year, MLPerf added a test for training large language models (LLMs), GPT-3 in particular. This month it adds Stable Diffusion, a text-to-image generator. Computers powered by Intel and Nvidia took on the new benchmark, and the rivals continued their earlier battle in training GPT-3, where they were joined this go-around by Google.

All three devoted …

