Oct. 17, 2023, 5:36 a.m. | /u/dec_dev

Data Science | www.reddit.com

Hi r/datascience,

From my experience working with data orchestration tools (primarily Airflow), I deal with a lot of repetitive fixes for flaky pipelines: resource exhaustion, single malformed entries and other edge cases, figuring out why a task isn't running, and so on. I was wondering whether any of you have the same experience in your day-to-day work. How much of the job is actually just dealing with repetitive issues and pipeline maintenance, and do …
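To make the kind of repetitive fix concrete, here is a rough sketch of the defensive settings I keep adding by hand. It assumes Airflow 2.4+ with the TaskFlow API; the DAG name, task name, and sample records are made up for illustration:

    # Rough sketch only (Airflow 2.4+ TaskFlow API assumed); names and data are illustrative.
    from datetime import datetime, timedelta

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2023, 1, 1), catchup=False)
    def flaky_pipeline():

        @task(
            retries=3,                                # retry transient failures automatically
            retry_delay=timedelta(minutes=5),
            execution_timeout=timedelta(minutes=30),  # kill hung tasks instead of waiting out resource exhaustion
        )
        def parse_records(records):
            """Keep well-formed rows; skip malformed entries instead of failing the whole run."""
            good = []
            for r in records:
                try:
                    good.append({"id": int(r["id"]), "value": float(r["value"])})
                except (KeyError, TypeError, ValueError):
                    continue  # a single bad entry should not take down the task
            return good

        parse_records([{"id": "1", "value": "2.5"}, {"id": None}])


    flaky_pipeline()

Most of what I do day-to-day is variations on the three knobs above (retries, timeouts, per-record error handling), which is what prompted the question.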


Software Engineer for AI Training Data (School Specific)

@ G2i Inc | Remote

Software Engineer for AI Training Data (Python)

@ G2i Inc | Remote

Software Engineer for AI Training Data (Tier 2)

@ G2i Inc | Remote

Data Engineer

@ Lemon.io | Remote: Europe, LATAM, Canada, UK, Asia, Oceania

Artificial Intelligence – Bioinformatic Expert

@ University of Texas Medical Branch | Galveston, TX

Lead Developer (AI)

@ Cere Network | San Francisco, US