Sept. 9, 2022, 7:16 a.m. | Shashank Prasanna

Towards Data Science - Medium towardsdatascience.com

Learn about how Docker simplifies access to NVIDIA GPUs, AWS Inferentia and scaling ML containers on Kubernetes

Illustration by author

If you told me a few years ago that data scientists would be using Docker containers in their day-to-day work, I wouldn’t have believed you. As a member of the broader machine learning (ML) community, I always considered Docker, Kubernetes, and Swarm (remember that?) to be exotic infrastructure tools for IT/Ops experts. Today it’s a different story: rarely a day goes …
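As a flavor of what the article's subtitle refers to, a minimal sketch of how Docker exposes NVIDIA GPUs to a container might look like this. It assumes Docker 19.03+ and the NVIDIA Container Toolkit are installed on the host, and uses a public `nvidia/cuda` base image as an illustrative example:

```shell
# Run nvidia-smi inside a CUDA container, with all host GPUs exposed
# via the --gpus flag (requires the NVIDIA Container Toolkit on the host).
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi

# Or expose a single GPU by index:
docker run --rm --gpus '"device=0"' nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```

The `--gpus` flag is what lets the same container image run unchanged on a laptop CPU or a multi-GPU server, which is a large part of Docker's appeal for ML work.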

