Oct. 23, 2022, 1:25 p.m. | /u/nacho_biznis

Data Science www.reddit.com

Hello, all! This is more of a question for the Data Engineers here.

I am new to Airflow and Docker. I am trying to do something which seems relatively simple.

I already have an ETL job in PySpark that can be run from the command line, `python main.py` style. I understand how to write the DAG file under the dags/ folder. But I have a few issues:

1) I don't want to run a method with arguments. I want to run …
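For running an existing script as-is (rather than wrapping its logic in a Python callable), the usual Airflow pattern is a BashOperator whose `bash_command` is exactly the command you'd type in a terminal. Below is a minimal sketch of assembling that command string; the script path and `--run_date` argument are hypothetical placeholders, not anything from the original post.

```python
# Sketch: build the shell command an Airflow BashOperator task would run.
# The script path and CLI argument names below are hypothetical examples.
import shlex

def build_etl_command(script="/opt/jobs/main.py", **cli_args):
    """Assemble 'python main.py --key value ...' exactly as typed on the CLI."""
    parts = ["python", script]
    for key, value in cli_args.items():
        parts += [f"--{key}", str(value)]
    return " ".join(shlex.quote(p) for p in parts)

# "{{ ds }}" is Airflow's Jinja template for the logical run date.
cmd = build_etl_command(run_date="{{ ds }}")
print(cmd)

# In the DAG file under dags/, this string would be handed to a BashOperator:
#   from airflow.operators.bash import BashOperator
#   etl = BashOperator(task_id="run_etl", bash_command=cmd)
```

This keeps the script untouched: Airflow just shells out to it, the same way you run it by hand today.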

airflow datascience
