Dec. 6, 2023, 9:08 p.m. | Emilia David

The Verge - All Posts www.theverge.com


Image: PaulSakuma.com


AMD wants people to remember that Nvidia’s not the only company selling AI chips. It’s announced the availability of new accelerators and processors geared toward running large language models, or LLMs.


The chipmaker unveiled the Instinct MI300X accelerator and the Instinct MI300A accelerated processing unit (APU), which it says are built to train and run LLMs. The company said the MI300X has 1.5 times more memory capacity than the previous MI250X version. Both new products have better …

