I have an Airflow DAG with six tasks, all using GKEPodOperator. My DAG has an execution_timeout of 1 hour, but sometimes Task 5 takes longer to execute, causing failures.
I want to set a dynamic execution_timeout for Task 5 using the following logic:
• DAG start time + 60 minutes (the total execution window)
• Subtract the end time of Task 4 (the task immediately before Task 5)
• Leave a 2-minute buffer for Task 6 to complete
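The computation I have in mind could be sketched as plain Python, independent of Airflow (the function name and sample datetimes below are hypothetical, just to illustrate the arithmetic):

```python
from datetime import datetime, timedelta

def remaining_timeout(dag_start: datetime, task4_end: datetime,
                      window: timedelta = timedelta(minutes=60),
                      buffer: timedelta = timedelta(minutes=2)) -> timedelta:
    """Time budget left for Task 5: the overall window, minus time
    already consumed up to Task 4's end, minus a buffer for Task 6."""
    deadline = dag_start + window - buffer
    remaining = deadline - task4_end
    # Never return a negative timeout
    return max(remaining, timedelta(0))

# Example: Task 4 finished 40 minutes into the run,
# so Task 5 gets 60 - 40 - 2 = 18 minutes.
start = datetime(2024, 1, 1, 10, 0)
end4 = start + timedelta(minutes=40)
print(remaining_timeout(start, end4))  # 0:18:00
```

The open question is how to feed real values (the DAG run's start and Task 4's end) into this and apply the result as Task 5's execution_timeout.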
I tried retrieving the DAG start time and Task 4's end time using XCom and dag_run.get_task_instance(), but this fails inside GKEPodOperator.
Is there a way to dynamically compute and apply execution_timeout within Task 5 itself, without creating an extra task?
Any suggestions on achieving this in Airflow?