Sept. 1, 2023, 2:40 a.m. | Chaim Rand

Towards Data Science (Medium) | towardsdatascience.com

Yet another money-saving AI-model training hack

Photo by Lisheng Chang on Unsplash

The topic of this post is AWS's home-grown AI chip, AWS Inferentia — more specifically, the second-generation AWS Inferentia2. This is a sequel to last year's post on AWS Trainium and joins our series of posts on dedicated AI accelerators. Unlike the chips explored in the previous posts in this series, AWS Inferentia was designed for AI model inference and …

