Dataset Next Trigger Modal Not Populating Latest Update #26892
Comments
Just to expand upon the state that is causing this issue a bit more: the DatasetEvent model looks correct in the database. Unsure why this query isn't populating the Latest Update column (Lines 3447 to 3451 in 8898db9).
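The query under discussion lives in Airflow's webserver code (the embedded snippet referenced above). As a rough sketch of the shape such a "latest update per dataset" lookup takes — using a hypothetical, simplified stand-in table, not Airflow's actual schema:

```python
import sqlite3

# Hypothetical, simplified stand-in for a dataset-event table -- NOT
# Airflow's real schema. The point is the query shape: group events by
# dataset and take the newest timestamp, which is the value the
# "Latest Update" column should display.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dataset_event (dataset_id INTEGER, timestamp TEXT)")
conn.executemany(
    "INSERT INTO dataset_event VALUES (?, ?)",
    [
        (1, "2022-10-01T00:00:00"),
        (1, "2022-10-05T00:00:00"),  # newer event for dataset 1
        (2, "2022-10-03T00:00:00"),
    ],
)
# ISO-8601 strings compare lexicographically, so MAX() picks the latest.
latest = dict(
    conn.execute(
        "SELECT dataset_id, MAX(timestamp) FROM dataset_event GROUP BY dataset_id"
    ).fetchall()
)
# latest now maps each dataset to its most recent event timestamp
```

If the events shown in the database screenshot exist but the modal is empty, the gap is presumably between a query like this and the serialization sent to the UI.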
@tseruga could you share the … (Lines 3461 to 3468 in 8898db9)?
Feel free to assign this to me. I agree this should be more robust and clear in the UI.
Apache Airflow version
2.4.1
What happened
When using dataset scheduling, it isn't obvious which datasets a downstream dataset consumer is awaiting in order for the DAG to be scheduled.
I would assume that this is supposed to be solved by the Latest Update column in the modal that opens when selecting "x of y datasets updated", but it appears that the data isn't being populated. Although one of the datasets has been produced, there is no data in the Latest Update column of the modal. In the above example, both datasets have been produced more than once.
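For context on the feature involved: with dataset scheduling (introduced in Airflow 2.4), a consumer DAG runs once every dataset in its schedule has received an event. A minimal producer/consumer sketch — the dataset URI and DAG ids are illustrative, not taken from this report, and the file is a DAG definition that requires an Airflow 2.4+ environment:

```python
# Illustrative DAG-definition sketch of dataset scheduling (Airflow 2.4+).
# The URI and DAG ids are made up for this example.
from datetime import datetime

from airflow import DAG
from airflow.datasets import Dataset
from airflow.operators.empty import EmptyOperator

example_ds = Dataset("s3://bucket/example.csv")

with DAG("producer", start_date=datetime(2022, 1, 1), schedule="@daily"):
    # Declaring the dataset as an outlet makes a successful task run
    # emit a DatasetEvent for it.
    EmptyOperator(task_id="produce", outlets=[example_ds])

with DAG("consumer", start_date=datetime(2022, 1, 1), schedule=[example_ds]):
    # Scheduled once all datasets in the list have been updated; the
    # "x of y datasets updated" modal in the UI tracks this state.
    EmptyOperator(task_id="consume")
```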
What you think should happen instead
The Latest Update column should be populated with the latest update timestamp for each dataset required to schedule a downstream, dataset-consuming DAG. Ideally there would be some form of highlighting on the "missing" datasets for quick visual feedback when DAGs have a large number of datasets required for scheduling.
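The highlighting suggested above could key off which required datasets have no recorded event yet. A minimal sketch — `missing_datasets` is a hypothetical helper, not Airflow code, and it assumes the UI already has each dataset's latest event timestamp (or None when no event exists):

```python
from datetime import datetime
from typing import Optional

def missing_datasets(required: list[str],
                     latest_update: dict[str, Optional[datetime]]) -> list[str]:
    """Return the dataset URIs with no recorded update yet -- the
    candidates for highlighting in the 'x of y datasets updated' modal.
    (Hypothetical helper for illustration, not Airflow's API.)"""
    return [uri for uri in required if latest_update.get(uri) is None]

# Example: one of two required datasets has been produced.
missing = missing_datasets(
    ["s3://bucket/a.csv", "s3://bucket/b.csv"],
    {"s3://bucket/a.csv": datetime(2022, 10, 5)},
)
```

With a large dataset list, rendering the returned URIs in a highlight color would give the quick visual feedback described above.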
How to reproduce
Select the "x of y datasets updated" button.
Operating System
Debian GNU/Linux 11 (bullseye)
Versions of Apache Airflow Providers
No response
Deployment
Docker-Compose
Deployment details
No response
Anything else
No response
Are you willing to submit PR?
Code of Conduct
I agree to follow this project's Code of Conduct