
tailsampler: Combine batches of spans into a single batch #1864

Merged · 3 commits into open-telemetry:master on Oct 1, 2020

Conversation

chris-smith-zocdoc
Contributor

Description:
Adding feature #1834

This changes the tail sampler to emit all spans for a single trace in a single batch. This allows downstream processors to easily perform operations on the entire trace.

Testing:
Added test
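As a rough illustration of the approach (a minimal sketch, not the exact code in this PR; it assumes the pdata package path used by the collector at the time and the generated ResourceSpansSlice.MoveAndAppendTo helper):

package tailsamplingprocessor

import "go.opentelemetry.io/collector/consumer/pdata"

// combineTraces moves the ResourceSpans of every batch received for one trace
// into a single pdata.Traces, so the next consumer sees the whole trace at once.
func combineTraces(batches []pdata.Traces) pdata.Traces {
    combined := pdata.NewTraces()
    for _, batch := range batches {
        // MoveAndAppendTo transfers the resource spans into the combined
        // message, leaving the source batch empty.
        batch.ResourceSpans().MoveAndAppendTo(combined.ResourceSpans())
    }
    return combined
}

The combined pdata.Traces is then passed to the next consumer in a single ConsumeTraces call, rather than one call per batch that arrived for the trace.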


codecov bot commented Sep 25, 2020

Codecov Report

Merging #1864 into master will increase coverage by 0.02%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master    #1864      +/-   ##
==========================================
+ Coverage   91.23%   91.26%   +0.02%     
==========================================
  Files         272      272              
  Lines       16263    16266       +3     
==========================================
+ Hits        14838    14845       +7     
+ Misses        998      996       -2     
+ Partials      427      425       -2     
Impacted Files Coverage Δ
...mplingprocessor/tailsamplingprocessor/processor.go 74.40% <100.00%> (+0.36%) ⬆️
translator/internaldata/resource_to_oc.go 91.48% <0.00%> (+4.25%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

}

_ = tsp.nextConsumer.ConsumeTraces(policy.ctx, allSpans)
Member

Perhaps not for this PR, but in case an error happens, it would be nice to have it logged, even if only at debug level.

Contributor Author

Yeah, we can do this in a separate PR.
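A minimal sketch of that follow-up (assuming the processor keeps a *zap.Logger in a field named logger and imports go.uber.org/zap; both are assumptions, not the actual struct layout):

if err := tsp.nextConsumer.ConsumeTraces(policy.ctx, allSpans); err != nil {
    // Hypothetical follow-up: surface downstream failures at debug level
    // instead of discarding the error.
    tsp.logger.Debug("Failed to forward sampled spans to the next consumer", zap.Error(err))
}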

}
}

func findTrace(a []pdata.Traces, traceID pdata.TraceID) int {
Member

The name is a bit misleading, as it's not finding a trace, it's returning the number of traces that were found for the given ID. Suggestion: numTracesWithID()?

Contributor Author

There is probably a better way to do this, but it's finding the index of the trace in the array. I added this because there isn't a guaranteed order in which the traces are received.

In the case where the trace is not found it returns the size of the array, which might be confusing; I could change this to -1 if you'd like.

Contributor

Would using t.Fatal() or assert.Fail() be more appropriate, since it's an invalid test state?

Contributor Author

I've updated this method to return the trace or nil, which makes it much clearer.
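Something along these lines (a hedged sketch of the revised helper, not the exact test code from the PR; the traversal through InstrumentationLibrarySpans and the reflect.DeepEqual comparison are assumptions made to avoid depending on the exact pdata.TraceID representation):

// findTrace returns the batch whose first span carries traceID, or nil if no
// batch matches, so callers can assert on the result directly.
func findTrace(batches []pdata.Traces, traceID pdata.TraceID) *pdata.Traces {
    for i := range batches {
        rs := batches[i].ResourceSpans()
        if rs.Len() == 0 {
            continue
        }
        ils := rs.At(0).InstrumentationLibrarySpans()
        if ils.Len() == 0 || ils.At(0).Spans().Len() == 0 {
            continue
        }
        if reflect.DeepEqual(ils.At(0).Spans().At(0).TraceID(), traceID) {
            return &batches[i]
        }
    }
    return nil
}

The caller can then use require.NotNil(t, trace) from testify on the result, which also addresses the t.Fatal()/assert.Fail() point above: a missing trace fails the test explicitly instead of being encoded as an out-of-range index.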

span.SetSpanID(tracetranslator.UInt64ToByteSpanID(uint64(i + 1)))

spanID++
span.SetSpanID(tracetranslator.UInt64ToByteSpanID(uint64(spanID)))
Contributor Author

Previously this was creating duplicate span IDs; now it's a monotonic sequence. The other tests don't seem to rely on the span ID.

@bogdandrutu bogdandrutu merged commit 07e8d7b into open-telemetry:master Oct 1, 2020
@chris-smith-zocdoc chris-smith-zocdoc deleted the issue_1834 branch October 1, 2020 19:00
MovieStoreGuy pushed a commit to atlassian-forks/opentelemetry-collector that referenced this pull request Nov 11, 2021
hughesjj pushed a commit to hughesjj/opentelemetry-collector that referenced this pull request Apr 27, 2023
* Update core/contrib deps to v0.58.0

* Update config source provider

* Add make fmt tidy and adopt fixes

* create internal/tools module for project

* use wget instead of now missing curl in kafka tests

Co-authored-by: Ryan Fitzpatrick <[email protected]>