Oct. 23, 2022, 1:25 p.m. | /u/nacho_biznis

Data Science www.reddit.com

Hello, all! This is more of a question for the Data Engineers here.

I am new to Airflow and Docker. I am trying to do something which seems relatively simple.

I already have an ETL job in PySpark that can be run from the command line, `python main.py` style. I understand how to write the DAG file under the dags/ folder. But I have a few issues:

1) I don't want to run a method with arguments. I want to run …
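For reference, a minimal sketch of what a DAG under dags/ could look like, assuming Airflow 2.x: instead of wrapping the job's logic in a PythonOperator callable with arguments, a BashOperator can invoke the existing script exactly as it is run from the command line. The DAG id, schedule, and script path below are all hypothetical placeholders.

```python
# dags/pyspark_etl.py -- minimal sketch, assuming Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="pyspark_etl",                # hypothetical DAG name
    start_date=datetime(2022, 10, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run the whole script as one task, just like `python main.py`
    # on the command line, rather than calling a method with arguments.
    run_etl = BashOperator(
        task_id="run_main",
        bash_command="python /opt/airflow/jobs/main.py",  # hypothetical path
    )
```

If the script lives in a separate Docker image rather than on the Airflow worker, the DockerOperator (from the `apache-airflow-providers-docker` package) is the analogous approach, pointing `image=` at that container instead of shelling out.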

airflow datascience
