
feat: improve metric reporting #376

Merged: 1 commit merged into master from feat/indexer-metrics on Feb 9, 2021

Conversation

@iand (Contributor) commented Feb 1, 2021

Various cleanups and improvements (a rough sketch of the tagging approach follows the list):

  • ensure only the tasks passed on the command line are used in the task tag on metrics
  • add a tag for the actor type (for processing metrics)
  • add a tag for the table name (for persistence metrics)
  • add counts of failed processing and persistence functions
  • export the count of lens calls so it is reported properly
  • add metrics to some lens calls that were previously missed
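
For context, the sketch below shows one way the tags and failure counters described above could be declared with OpenCensus, which the diff further down uses via the tag package and metrics.Table. Apart from the Table key, the key names, measure names, and helper functions here are illustrative assumptions, not the exact identifiers introduced by this PR.

// Illustrative sketch only: OpenCensus tag keys and failure counters along
// the lines described above. Except for the Table key (visible in the diff
// below as metrics.Table), names are assumptions, not the PR's identifiers.
package metrics

import (
	"context"

	"go.opencensus.io/stats"
	"go.opencensus.io/tag"
)

var (
	// Tag keys attached to metric measurements.
	Task      = tag.MustNewKey("task")       // tasks selected on the command line
	ActorCode = tag.MustNewKey("actor_code") // actor type being processed
	Table     = tag.MustNewKey("table")      // destination table for persistence

	// Counters of failed processing and persistence functions.
	ProcessingFailure  = stats.Int64("processing_failure_total", "Number of failed processing functions", stats.UnitDimensionless)
	PersistenceFailure = stats.Int64("persistence_failure_total", "Number of failed persistence functions", stats.UnitDimensionless)
)

// WithTableTag returns a context carrying the table name so that any
// measurements recorded during persistence are labelled with it.
func WithTableTag(ctx context.Context, table string) context.Context {
	ctx, _ = tag.New(ctx, tag.Upsert(Table, table))
	return ctx
}

// RecordPersistenceFailure increments the persistence failure counter using
// whatever tags are already present on the context. A view over the measure
// would also need to be registered and exported for it to be reported.
func RecordPersistenceFailure(ctx context.Context) {
	stats.Record(ctx, PersistenceFailure.M(1))
}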

@iand force-pushed the feat/indexer-metrics branch 4 times, most recently from 87940a3 to 9beb23c, on February 1, 2021 at 12:45
@codecov-io

Codecov Report

Merging #376 (9beb23c) into master (ad4a96a) will decrease coverage by 0.1%.
The diff coverage is 18.1%.

@@           Coverage Diff            @@
##           master    #376     +/-   ##
========================================
- Coverage    43.1%   42.9%   -0.2%     
========================================
  Files          25      25             
  Lines        1932    1941      +9     
========================================
  Hits          833     833             
- Misses        972     981      +9     
  Partials      127     127             

@iand force-pushed the feat/indexer-metrics branch 5 times, most recently from 8e53c3f to 42eee3e, on February 1, 2021 at 15:55
@iand changed the title from "feat: record actor task metrics" to "feat: improve metric reporting" on Feb 1, 2021
@placer14 (Contributor) left a comment

One question about metric capturing around batches vs individual model persistence. Approval is sticky. 🤝

@@ -34,6 +41,10 @@ func (actors ActorList) Persist(ctx context.Context, s model.StorageBatch) error
ctx, span := global.Tracer("").Start(ctx, "ActorList.Persist", trace.WithAttributes(label.Int("count", len(actors))))
defer span.End()

ctx, _ = tag.New(ctx, tag.Upsert(metrics.Table, "actors"))
@placer14 (Contributor) commented:

Should we be tagging metrics for list persistence the same as individual persistence? Seems this would record duration for a batch and individuals just the same and might make the data unreliable. (If you change this, it appears this is the same for all the models.)

@iand (Author) replied:

I don't think it matters. The list could be a single item and we would want that to be treated identically to the individual case. We're aiming to measure how long visor takes to persist all the data it extracts, not necessarily per-item timings.
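
To illustrate the point being made here, the standalone sketch below records one duration per persist call, tagged with the table name, regardless of how many models the batch holds. The measure and tag names, and the persistBatch helper, are illustrative assumptions rather than the PR's actual code.

// Minimal standalone sketch: one duration measurement is recorded per
// persist call, tagged with the table name, whether the batch holds one
// model or many. Names here are illustrative assumptions.
package main

import (
	"context"
	"fmt"
	"time"

	"go.opencensus.io/stats"
	"go.opencensus.io/tag"
)

var (
	tableKey        = tag.MustNewKey("table")
	persistDuration = stats.Float64("persist_duration_ms", "Wall-clock time to persist a batch", stats.UnitMilliseconds)
)

// persistBatch stands in for a model's Persist method: tag the context with
// the destination table, do the work, then record a single duration for the
// whole batch.
func persistBatch(ctx context.Context, table string, items int) {
	ctx, _ = tag.New(ctx, tag.Upsert(tableKey, table))

	start := time.Now()
	time.Sleep(time.Duration(items) * time.Millisecond) // stand-in for the real persistence work

	stats.Record(ctx, persistDuration.M(float64(time.Since(start).Milliseconds())))
	fmt.Printf("persisted %d items to table %q\n", items, table)
}

func main() {
	// A single-item batch and a larger batch are measured identically,
	// matching the argument above. (In a real program a view over
	// persistDuration would be registered and exported.)
	persistBatch(context.Background(), "actors", 1)
	persistBatch(context.Background(), "actors", 50)
}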

@iand merged commit b3a5127 into master on Feb 9, 2021
@iand deleted the feat/indexer-metrics branch on February 9, 2021 at 11:19