A Shakeout is Coming for Inference Startups
The Information www.theinformation.com
If you tuned into Nvidia's and Microsoft's recent quarterly earnings calls, you probably heard their leaders discuss how inference, the industry term for running a trained artificial intelligence model, is becoming a bigger part of the computing workloads in their data centers. In other words, chatbots like ChatGPT Enterprise are serving real customers. That contrasts with much of last year, when major tech firms were focused on developing, or training, their models, which also consumes a lot of computing capacity. …