Aug. 10, 2022, 1:34 p.m. | Chris Young

Towards Data Science on Medium (towardsdatascience.com)

Leveraging GitHub actions and Astronomer to quickly push code updates to a production environment

Photo by Mike Benna on Unsplash

Apache Airflow is a popular data orchestration tool used to manage workflows and tasks. However, one of the big questions I keep running into is how to deploy production-ready instances of Airflow. Options for hosting Airflow include self-managing it on a virtual machine, deploying to the cloud-based platform Astronomer, using AWS Managed Workflows for Apache Airflow (MWAA), and more.
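The excerpt above mentions pushing code updates to production with GitHub Actions and Astronomer but the full workflow is not shown here. A minimal sketch of what such a pipeline might look like follows, assuming the Astro CLI is used for deployment, that a deployment API token and Deployment ID are stored as repository secrets (the secret names `ASTRO_API_TOKEN` and `ASTRO_DEPLOYMENT_ID` are placeholders chosen for illustration), and that the repository root is an Astro project:

```yaml
# Hypothetical workflow file: .github/workflows/deploy.yml
# Deploys the Astro project to an Astronomer Deployment on every push to main.
name: Deploy to Astronomer

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      # Check out the repository containing the Astro project (DAGs, Dockerfile, etc.)
      - uses: actions/checkout@v4

      # Install the Astro CLI using Astronomer's install script
      - name: Install the Astro CLI
        run: curl -sSL install.astronomer.io | sudo bash -s

      # Authenticate with an API token and push the project to the Deployment
      - name: Deploy to the Astro Deployment
        env:
          ASTRO_API_TOKEN: ${{ secrets.ASTRO_API_TOKEN }}
        run: astro deploy ${{ secrets.ASTRO_DEPLOYMENT_ID }}
```

With a workflow like this, merging a pull request into `main` is enough to ship DAG changes to the production Airflow environment; no manual deploy step is needed.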

Of these options, I have …

Tags: airflow, cd, ci-cd-pipeline, data engineering, github, pipeline, python
