April 19, 2022, 4:03 p.m. | Ed Shee

Towards AI - Medium (pub.towardsai.net)

Photo by Marius Masalar on Unsplash

Ever trained a new model and just wanted to use it through an API straight away? Sometimes you don’t want to bother writing Flask code or containerizing your model and running it in Docker. If that sounds like you, you definitely want to check out MLServer. It’s a Python-based inference server that recently went GA, and what’s really neat about it is that it’s a highly performant server designed for production environments too. That …
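To give a sense of what that looks like in practice, here is a minimal sketch of serving a model with MLServer. It assumes a scikit-learn style model saved as model.joblib; the class name, file name, and output name are illustrative rather than taken from the article:

```python
# my_model.py - a minimal MLServer runtime (illustrative sketch)
import joblib

from mlserver import MLModel
from mlserver.codecs import NumpyCodec
from mlserver.types import InferenceRequest, InferenceResponse


class MyModel(MLModel):
    async def load(self) -> bool:
        # Load the trained model from disk (the path is an assumption for this example)
        self._model = joblib.load("model.joblib")
        return True

    async def predict(self, payload: InferenceRequest) -> InferenceResponse:
        # Decode the incoming V2 inference request into a NumPy array
        features = NumpyCodec.decode_input(payload.inputs[0])
        predictions = self._model.predict(features)
        # Encode the predictions back into a V2 inference response
        return InferenceResponse(
            model_name=self.name,
            outputs=[NumpyCodec.encode_output(name="predictions", payload=predictions)],
        )
```

With a small model-settings.json next to it (for example {"name": "my-model", "implementation": "my_model.MyModel"}), running `mlserver start .` exposes the model over REST and gRPC, with no Flask code or hand-rolled Dockerfile required.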

Tags: artificial intelligence, data science, learning, machine, machine learning, machine learning models, ml-model-deployment, python
