Aug. 10, 2022, 1:34 p.m. | Chris Young

Towards Data Science (Medium) | towardsdatascience.com

Leveraging GitHub Actions and Astronomer to quickly push code updates to a production environment

Photo by Mike Benna on Unsplash

Apache Airflow is a popular data orchestration tool used to manage workflows and tasks. However, one question I keep running into is how to deploy a production-ready Airflow instance. Hosting options include self-managing Airflow on a virtual machine, deploying to the cloud-based platform Astronomer, using AWS Managed Workflows for Apache Airflow (MWAA), and more.
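For the Astronomer route the article's title points at, a deployment is typically automated with a GitHub Actions workflow that runs the Astro CLI on pushes to the main branch. The sketch below is a minimal, hypothetical example: the file path, secret names, and deployment ID are placeholders, not values from the article.

```yaml
# Hypothetical workflow file: .github/workflows/deploy.yml
# Assumes an Astronomer API token and deployment ID are stored as
# repository secrets (names here are placeholders).
name: Deploy to Astronomer

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Install the Astro CLI on the runner
      - name: Install Astro CLI
        run: curl -sSL https://install.astronomer.io | sudo bash -s

      # Push the Airflow project (DAGs, dependencies) to the deployment
      - name: Deploy to Astronomer
        run: astro deploy "$ASTRO_DEPLOYMENT_ID"
        env:
          ASTRO_API_TOKEN: ${{ secrets.ASTRO_API_TOKEN }}
          ASTRO_DEPLOYMENT_ID: ${{ secrets.ASTRO_DEPLOYMENT_ID }}
```

Gating the job on pushes to `main` means merged pull requests reach production automatically, which is the quick-update loop the article describes.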

Of these options, I have …

Tags: airflow, cd, ci-cd-pipeline, data engineering, github, pipeline, python
