Sept. 1, 2023, 2:40 a.m. | Chaim Rand

Towards Data Science - Medium | towardsdatascience.com

Yet another money-saving AI-model training hack

Photo by Lisheng Chang on Unsplash

The topic of this post is AWS’s home-grown AI chip, AWS Inferentia — more specifically, the second-generation AWS Inferentia2. This is a sequel to our post from last year on AWS Trainium and joins a series of posts on dedicated AI accelerators. In contrast to the chips we explored in the previous posts in the series, AWS Inferentia was designed for AI model inference and …
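As a rough illustration of the workflow such a post typically walks through, the sketch below shows how a PyTorch model might be compiled for Inferentia2 using the AWS Neuron SDK's torch_neuronx API. The model, shapes, and file names are placeholders chosen for the example, not taken from the article, and the code assumes an Amazon EC2 Inf2 instance with the Neuron drivers and SDK installed.

```python
# A minimal sketch (not from the article): compiling a PyTorch model for
# AWS Inferentia2 with torch_neuronx, assuming an Inf2 instance with the
# AWS Neuron SDK installed.
import torch
import torch_neuronx


# Toy model standing in for whatever model the full post evaluates.
class TinyClassifier(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(128, 64),
            torch.nn.ReLU(),
            torch.nn.Linear(64, 10),
        )

    def forward(self, x):
        return self.net(x)


model = TinyClassifier().eval()
example_input = torch.rand(1, 128)

# Ahead-of-time compile the model for the Inferentia2 NeuronCores.
neuron_model = torch_neuronx.trace(model, example_input)

# The compiled artifact can be saved and later reloaded with torch.jit.load
# for low-latency inference on the Inf2 instance.
torch.jit.save(neuron_model, "tiny_classifier_neuron.pt")
print(neuron_model(example_input).shape)
```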

