In #3054 we introduced the basic functionality of regression testing. This ticket aims to track the next stage of development with the following goals:
- Developers are notified internally via Slack when a nightly run surpasses some threshold.
- Performance measurements are directly comparable to product goals.
- Performance measurements for released versions are locked in at release time.
- Performance metrics are measured for compilation as well as parsing.
Additional context
#4012 is a perfect example of a performance regression we would like to catch in development, but would require adding compilation to our measured metrics.
Implementation
There are two stages of implementation. Data collection should be done first so that we have more data to work with in the notification stage. Adding data collection for compilation performance requires adding a database to the tests, which adds some necessary complexity.
Data collection
Make sure the generated artifacts contain all the fields we would like to track:

- timestamp
- total parse time
- baseline & dev stddev parse value
- baseline & dev median parse value
- baseline & dev stddev compile value
- baseline & dev median compile value
- ms / file or number of files in project
- branch names
- raw runs (in case we want to do our own stats later in dbt)
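To make the field list above concrete, here is a minimal sketch of what an artifact record could look like as Python dataclasses. All names (`PerfSample`, `PerfArtifact`, the field names, and the example values) are hypothetical illustrations, not the runner's actual serialization format:

```python
from dataclasses import dataclass, field, asdict
from typing import List

# Hypothetical per-branch, per-metric measurement. "metric" distinguishes
# parse from compile; raw_runs_ms keeps the raw samples so stats can be
# recomputed later in dbt.
@dataclass
class PerfSample:
    branch: str               # e.g. "baseline" or "dev"
    metric: str               # "parse" or "compile"
    median_ms: float
    stddev_ms: float
    raw_runs_ms: List[float] = field(default_factory=list)

# Hypothetical top-level artifact: one nightly run over one project.
@dataclass
class PerfArtifact:
    timestamp: str            # ISO 8601
    total_parse_time_ms: float
    project_file_count: int   # enables ms/file comparisons across projects
    samples: List[PerfSample] = field(default_factory=list)

artifact = PerfArtifact(
    timestamp="2021-11-01T02:00:00Z",
    total_parse_time_ms=812.5,
    project_file_count=100,
    samples=[
        PerfSample("baseline", "parse", 8.1, 0.4, [7.9, 8.1, 8.4]),
        PerfSample("dev", "parse", 9.0, 0.5, [8.6, 9.0, 9.5]),
    ],
)

# asdict() gives a plain dict, ready to serialize as JSON for warehousing.
record = asdict(artifact)
```

Keeping `raw_runs_ms` alongside the precomputed median and stddev is slightly redundant, but it means a later change to the statistical method (e.g. trimmed means) does not invalidate historical data.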
Ship this data to Snowflake
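One plausible shipping path is to serialize each artifact as newline-delimited JSON, which Snowflake ingests natively. The sketch below writes the local file; the staging/copy step (shown only in comments) assumes `snowflake-connector-python` and credentials, and the stage/table names are placeholders:

```python
import json

# Illustrative flattened artifact; field names are placeholders matching
# the tracked fields, not a confirmed schema.
artifact = {
    "timestamp": "2021-11-01T02:00:00Z",
    "branch_baseline": "main",
    "branch_dev": "feature-branch",
    "baseline_median_parse_ms": 8.1,
    "dev_median_parse_ms": 9.0,
    "project_file_count": 100,
}

# One JSON object per line (NDJSON).
with open("perf_artifact.ndjson", "w") as f:
    f.write(json.dumps(artifact) + "\n")

# Hypothetical upload via an internal stage (not executed here):
#   cur.execute("PUT file://perf_artifact.ndjson @perf_stage")
#   cur.execute(
#       "COPY INTO perf_results FROM @perf_stage "
#       "FILE_FORMAT = (TYPE = 'JSON')"
#   )
```

Loading raw JSON into a `VARIANT` column and normalizing in dbt would also keep the runner's upload logic dumb and schema changes cheap.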
Notification
- Remove regression detection from the Rust runner (this logic will live in dbt)
- Decide thresholds for notifications
- Implement an actual notification via a failing dbt test or Looker
- Add performance regression watermarking to the release process
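Per the list above, the detection logic is intended to live in dbt, so the following Python sketch only illustrates the shape of the threshold check and the standard Slack incoming-webhook payload. The 5% threshold, function names, and webhook URL are all placeholders:

```python
import json
from urllib import request

THRESHOLD = 1.05  # placeholder: flag runs where dev is >5% slower than baseline

def regression_ratio(baseline_median: float, dev_median: float) -> float:
    """Ratio of dev to baseline timing; > 1.0 means dev is slower."""
    return dev_median / baseline_median

def notify_slack(webhook_url: str, ratio: float) -> None:
    """Post to a Slack incoming webhook (payload is the standard {"text": ...})."""
    payload = {"text": f"Nightly perf regression: dev ran at {ratio:.1%} of baseline"}
    req = request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)  # fire-and-forget; a real runner would retry/log

# Example check (notification call omitted so this runs offline):
ratio = regression_ratio(baseline_median=8.1, dev_median=9.0)
if ratio > THRESHOLD:
    print(f"would notify: ratio {ratio:.3f} exceeds {THRESHOLD}")
```

A failing dbt test could compute the same ratio in SQL over the Snowflake table and let orchestration tooling trigger the Slack alert, which keeps thresholds versioned alongside the models.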