NVIDIA's Eos supercomputer just broke its own AI training benchmark record
Engadget www.engadget.com
Depending on the hardware you're using, training a large language model of any significant size can take weeks, months, or even years to complete. That's no way to do business — nobody has the electricity and time to wait that long. On Wednesday, NVIDIA unveiled the newest iteration of its Eos supercomputer, one powered by more than 10,000 H100 Tensor Core GPUs and capable of training a 175 billion-parameter GPT-3 model on 1 billion tokens in under four minutes. That's …