Replies: 1 comment
Yes, it is expected. You should use DAG Versioning, for example with GitDagBundle, and make sure that this function lives in the git repo synced by the GitDagBundle. Then every time the function changes (a new git commit), a new version of the DAG is created, even if the serialized DAG is identical, and every parse and execution is associated with the commit it refers to.
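As a rough illustration only: a git-backed bundle is configured through the dag processor's bundle list. The bundle name, connection id, and branch below are made up, and the exact `classpath` differs between Airflow and git-provider versions, so check the docs for your version before copying this:

```
[dag_processor]
# JSON list of bundle definitions; can also be set via the
# AIRFLOW__DAG_PROCESSOR__DAG_BUNDLE_CONFIG_LIST environment variable.
dag_bundle_config_list = [
    {
      "name": "dags_repo",
      "classpath": "airflow.providers.git.bundles.git.GitDagBundle",
      "kwargs": {"tracking_ref": "main", "git_conn_id": "dags_repo_conn"}
    }
  ]
```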
We use git-sync and the KubernetesPodOperator.
We have created a function that returns either a KubernetesPodOperator or a DockerOperator. This allows local development: the DAG can be run locally for testing, and the function switches to the DockerOperator. The function is in its own file so it can be imported by every DAG.
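Such a factory might look like the minimal sketch below. It is not the poster's actual code: the environment-variable name `AIRFLOW_ENV`, the function names, and the default arguments are all assumptions.

```python
import os


def chosen_operator(env: str) -> str:
    """Decide which operator type to build for a given environment (hypothetical helper)."""
    return "docker" if env == "local" else "kubernetes"


def make_container_task(task_id: str, image: str, **kwargs):
    """Return a DockerOperator locally and a KubernetesPodOperator everywhere else."""
    # Imports are deferred so a machine with only one provider package installed still works.
    if chosen_operator(os.environ.get("AIRFLOW_ENV", "prod")) == "docker":
        from airflow.providers.docker.operators.docker import DockerOperator

        return DockerOperator(task_id=task_id, image=image, **kwargs)

    from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

    return KubernetesPodOperator(task_id=task_id, image=image, name=task_id, **kwargs)
```

Because this helper lives outside the DAG files, a change to it does not alter the DAG files themselves, which is exactly the situation described below.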
I have noticed that if only the function changes, Airflow (or the DAG processor) doesn't see that as a change to the DAG itself. That means no new DAG version is imported, so the changes don't get tested or picked up as a version.
The new code is used on the next DAG run, however, which makes it very hard to identify if and when it changed.
Is this expected behavior? Is there a workaround, other than modifying every DAG file every time the function is updated?
We are using `apache/airflow:3.0.2` via the current Helm chart version `1.18.0`.