
datasets, next_run_datasets, remove unnecessary timestamp filter #29441

Merged: 2 commits merged into apache:main on Feb 20, 2023

Conversation

michaelmicheal (Contributor)

Closes: #26892
I'm not sure what the intention of this filter was:
`DatasetEvent.timestamp > DatasetDagRunQueue.created_at`
When a dataset is updated,

  1. A DatasetEvent is created (and saved)
  2. Then, a DatasetDagRunQueue row is added for each dependent DAG.

I don't think we need this filter, and I've confirmed that it prevents the latest update from being shown in the Dataset Next Trigger modal.
Removing it resolves #26892
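
For reference, a rough sketch of what the relevant part of the join looks like with the filter (and the then-redundant `and_()`) removed. This is just an illustration of the shape of the change, not the exact diff, and the wrapper function is made up for the example:

    from airflow.models.dataset import DatasetEvent, DatasetModel

    def join_dataset_events(query):
        # Before: the join condition was
        #   and_(DatasetEvent.dataset_id == DatasetModel.id,
        #        DatasetEvent.timestamp > DatasetDagRunQueue.created_at)
        # After: only the dataset_id match remains, so and_() is unnecessary.
        return query.join(
            DatasetEvent,
            DatasetEvent.dataset_id == DatasetModel.id,
            isouter=True,
        )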



boring-cyborg bot added the area:webserver (Webserver related Issues) label on Feb 9, 2023
@@ -3715,7 +3715,6 @@ def next_run_datasets(self, dag_id):
                 DatasetEvent,
                 and_(
                     DatasetEvent.dataset_id == DatasetModel.id,
-                    DatasetEvent.timestamp > DatasetDagRunQueue.created_at,
Contributor

@blag Does this look right to you? re: #26356

blag (Contributor), Feb 9, 2023

😬 It's been a long minute since I wrote this, but...

I believe when I wrote this the intent with the lastUpdate field was to only show last updates since the last time the dag was queued/run. But yeah, the lastUpdate label isn't descriptive enough for that.

Option 1: Personally, I would consider changing the label that the lastUpdate field is rendered under to something like "Last update since last run", or something more wordy.

Option 2: But if you don't want to do that, and you want to display the last update for every dataset regardless of whether it has already been "consumed" by a DagRun (e.g. in either the DatasetDagRunQueue or actually scheduled into a DagRun), then yeah, it makes sense to remove this filter. However, I would also remove the and_ around it, since then there would only be one filter condition in that join:

                .join(
                    DatasetEvent,
                    DatasetEvent.dataset_id == DatasetModel.id,
                    isouter=True,
                )

If you go for option 2, I think you should be able to compare the existence and creation time of the DDRQ with the DatasetEvent timestamp to figure out whether or not the last update time has already triggered a DDRQ/DagRun or if it has partially satisfied the conditions of a future DagRun.

Hope this makes sense.
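
Something like this, maybe (completely untested sketch, the helper name is made up, and it assumes the DDRQ columns are dataset_id, target_dag_id and created_at):

    from airflow.models.dataset import DatasetDagRunQueue, DatasetEvent

    def last_event_still_queued(session, dataset_id, dag_id):
        """Is the most recent DatasetEvent for this dataset still sitting in the
        DDRQ for this DAG, i.e. not yet consumed by a DagRun?"""
        last_event = (
            session.query(DatasetEvent)
            .filter(DatasetEvent.dataset_id == dataset_id)
            .order_by(DatasetEvent.timestamp.desc())
            .first()
        )
        if last_event is None:
            return False
        return session.query(
            session.query(DatasetDagRunQueue)
            .filter(
                DatasetDagRunQueue.dataset_id == dataset_id,
                DatasetDagRunQueue.target_dag_id == dag_id,
                DatasetDagRunQueue.created_at >= last_event.timestamp,
            )
            .exists()
        ).scalar()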

michaelmicheal (Author)

> However, I would also remove the and_ around it since then there would only be one filter condition in that join:

Yes, you're right, the `and_` becomes unnecessary.

I think there might be some confusion around DDRQ. My understanding is that when a DatasetEvent is created, a DDRQ record is created per consuming DAG. Then, once a DAG has an associated DDRQ record for each Dataset that it depends on, a dag_run is created and all DDRQ records associated with that DAG are deleted.

> If you go for option 2, I think you should be able to compare the existence and creation time of the DDRQ with the DatasetEvent timestamp to figure out whether or not the last update time has already triggered a DDRQ/DagRun or if it has partially satisfied the conditions of a future DagRun.

As I understand it, if there are DDRQ records for a DAG, we can assume that there hasn't been a DagRun triggered since the last DatasetEvent (because we delete DDRQ records on the creation of a DagRun).
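
A rough sketch of that assumption (illustrative only, the helper isn't from the codebase):

    from airflow.models.dataset import DatasetDagRunQueue

    def dataset_run_pending(session, dag_id):
        # DDRQ rows for a DAG are deleted when its dataset-triggered DagRun is
        # created, so any remaining row means no run has been triggered since
        # the last DatasetEvent for at least one of its upstream datasets.
        return session.query(
            session.query(DatasetDagRunQueue)
            .filter(DatasetDagRunQueue.target_dag_id == dag_id)
            .exists()
        ).scalar()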

michaelmicheal (Author)

@blag Does that make sense to you?

Contributor

I just dug through the code in more detail, and yes, your understanding seems to be correct. 😄

michaelmicheal (Author)

@ephraimbuddy @ashb Is this good to merge?

michaelmicheal (Author)

@potiuk is this good to merge?

potiuk merged commit 6f9efbd into apache:main on Feb 20, 2023
pierrejeambrun added this to the Airflow 2.5.2 milestone on Feb 27, 2023
pierrejeambrun added the type:bug-fix (Changelog: Bug Fixes) label on Feb 27, 2023
pierrejeambrun pushed a commit that referenced this pull request on Mar 7, 2023:

datasets, next_run_datasets, remove unnecessary timestamp filter (#29441)

* datasets, next_run_datasets, remove unnecessary timestamp filter

* remove redundant `_and`

(cherry picked from commit 6f9efbd)
pierrejeambrun pushed a commit that referenced this pull request on Mar 8, 2023:

datasets, next_run_datasets, remove unnecessary timestamp filter (#29441)

* datasets, next_run_datasets, remove unnecessary timestamp filter

* remove redundant `_and`

(cherry picked from commit 6f9efbd)
Labels: area:webserver (Webserver related Issues), type:bug-fix (Changelog: Bug Fixes)

Successfully merging this pull request may close these issues:
Dataset Next Trigger Modal Not Populating Latest Update (#26892)

6 participants