Sept. 21, 2023, 9 a.m. | InfoWorld Machine Learning | www.infoworld.com



In response to growing demand for generative AI applications and large language models (LLMs), Oracle Cloud Infrastructure (OCI) has made Nvidia H100 Tensor Core GPUs available on the OCI Compute platform. Nvidia L40S GPUs will also be coming to the platform soon.

Oracle said OCI Compute now offers bare-metal instances with Nvidia H100 GPUs, built on the Nvidia Hopper architecture for AI, which the company says deliver an “order-of-magnitude performance leap” for large-scale AI and high-performance computing applications. The Nvidia H100 GPU …
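Provisioning one of the new instances goes through the standard OCI Compute API. The following is a minimal sketch using the OCI Python SDK; the OCIDs are placeholders and the shape name is an assumption based on Oracle's GPU shape naming rather than something stated in the article, so verify it against the shapes listed in your tenancy.

# Minimal sketch: launching a GPU bare-metal instance on OCI Compute with the
# OCI Python SDK. All OCIDs are placeholders, and the shape name is an
# assumption -- check the GPU shapes actually offered in your tenancy/region.
import oci

config = oci.config.from_file()           # reads credentials from ~/.oci/config
compute = oci.core.ComputeClient(config)

launch_details = oci.core.models.LaunchInstanceDetails(
    compartment_id="ocid1.compartment.oc1..example",
    availability_domain="Uocm:PHX-AD-1",
    shape="BM.GPU.H100.8",                # assumed H100 bare-metal shape name
    display_name="h100-training-node",
    source_details=oci.core.models.InstanceSourceViaImageDetails(
        image_id="ocid1.image.oc1..example"   # e.g. a GPU-enabled OS image
    ),
    create_vnic_details=oci.core.models.CreateVnicDetails(
        subnet_id="ocid1.subnet.oc1..example"
    ),
)

response = compute.launch_instance(launch_details)
print(response.data.lifecycle_state)      # typically PROVISIONING at first

Once the instance reaches the RUNNING state, it can be used like any other OCI Compute host, with the GPUs exposed to whatever AI or HPC stack is installed on the image.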

