Nov. 13, 2023, 7:47 p.m. | Dr. Tony Hoang

The Artificial Intelligence Podcast

Nvidia has unveiled the H200, a state-of-the-art GPU designed for training and deploying large AI models. The chip features next-generation HBM3e memory and delivers inference output at nearly double the speed of its predecessor, the H100. Nvidia's stock has soared on strong demand for AI GPUs, and the company expects roughly $16 billion in revenue this quarter. The high-end H200 is expected to hit the market in the second quarter of 2024.




