fix: change logic of analysis run to better handle errors #2695
Conversation
Codecov Report
Patch coverage:
Additional details and impacted files
@@ Coverage Diff @@
## master #2695 +/- ##
==========================================
+ Coverage 81.47% 81.62% +0.14%
==========================================
Files 133 133
Lines 20154 20152 -2
==========================================
+ Hits 16421 16449 +28
+ Misses 2881 2849 -32
- Partials 852 854 +2
☔ View full report in Codecov by Sentry.
Signed-off-by: zachaller <[email protected]>
Force-pushed from 8f7fbd9 to 35e21aa (Compare)
Kudos, SonarCloud Quality Gate passed! 0 Bugs. No coverage information.
Please check my comments.
 if t.incompleteMeasurement != nil {
 	newMeasurement = *t.incompleteMeasurement
 } else {
 	startedAt := timeutil.MetaNow()
 	newMeasurement.StartedAt = &startedAt
 }
 newMeasurement.Phase = v1alpha1.AnalysisPhaseError
-newMeasurement.Message = err.Error()
+newMeasurement.Message = providerErr.Error()
You are not returning to the caller, which will make the function continue executing even after the error occurs. Is this new behaviour expected?
Yes, it is expected; this is the fix for the actual bug. This function runs in a goroutine, and the returned error was never handled anyway. By letting the code continue, provider errors now reach end users via the MetricRun status. The return was there because the old logic required the provider to be valid in order to call GetMetadata(). I reworked it so that is no longer required, so it should be safe to continue and report the error.
LGTM
* change logic of analysis run to better handle errors
* change logic to not call GetMetaData if not nil, like the old behavior
* move code closer to usage
* change logic to not always call GetMetadata; keeps original behavior
* fix logic
* cleanup
* add test

Signed-off-by: zachaller <[email protected]>
The error-handling logic for AnalysisRuns was incorrect: we returned an error from a goroutine, and that error was never used. This caused errors not to be bubbled up into the status field of the AnalysisRun resource. This change reworks the logic to make sure we bubble the error up to the AnalysisRun object as well as log it.
fixes: #2696
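The root cause described above, that an error returned from a goroutine is silently discarded, can be shown with a minimal sketch. The `status` type and `run` function are hypothetical illustrations, not the controller's actual code:

```go
package main

import (
	"errors"
	"fmt"
	"sync"
)

// status is a stand-in for a shared resource status (like the AnalysisRun
// status field) that the goroutine can write errors into.
type status struct {
	mu      sync.Mutex
	message string
}

func (s *status) set(msg string) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.message = msg
}

// run simulates a metric evaluation that hits a provider error.
func run(s *status) error {
	err := errors.New("metric provider failed")
	// Old behavior: only `return err` — inside a goroutine the value is
	// discarded, so the error never surfaces anywhere.
	// New behavior: record the error on the shared status before returning.
	s.set(err.Error())
	return err
}

func main() {
	var s status
	var wg sync.WaitGroup
	wg.Add(1)
	go func() {
		defer wg.Done()
		_ = run(&s) // the returned error has no consumer here
	}()
	wg.Wait()
	fmt.Println(s.message)
}
```

Because `go func()` has no mechanism to deliver a return value to a caller, writing the error into shared, observable state (here `status`; in the PR, the AnalysisRun status) is the only way it becomes visible to users.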