Dec. 23, 2023, 10:39 a.m. | Towards AI Editorial Team

Towards AI - Medium pub.towardsai.net

Demand for building products with Large Language Models has surged since the launch of ChatGPT. This has driven massive growth in the compute needed for training models and for running them (inference). Nvidia GPUs dominate market share, particularly the A100 and H100 chips, but AMD has also grown its GPU offering, and companies like Google have built custom AI chips in-house (TPUs). Nvidia's data center revenue (predominantly sales of GPUs for LLM use cases) grew 279% year-over-year in Q3 2023 …

