
Remove deprecated transports #452

Merged

Conversation

@kornholi (Contributor) commented Jan 17, 2019

Closes #386.

@c24t (Member) commented Jan 17, 2019

Fixes #386.

@c24t (Member) commented Jan 17, 2019

Besides deleting the duplicate code, this changes the batching for spans from 10 spans per 1 s to 200 spans per 60 s, and changes the way the queue is flushed (from #384).

The new flushing behavior hasn't caused any problems for stats, and should be safe for traces. I'm interested to know how you tested the batch size change, though. The new batch size is consistent with other clients, but it means that users sending between 200 and 600 spans/traces per minute now have to change the default.
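The 200-to-600 range above follows from the two defaults: the export ceiling is batch size divided by wait period. A quick sketch of the arithmetic (variable names are illustrative, not from the codebase):

```python
# Effective export ceiling = batch_size / wait_period, in spans per minute.
old_batch, old_wait = 10, 1      # previous default: 10 spans per 1 s flush
new_batch, new_wait = 200, 60    # new default: 200 spans per 60 s flush

old_rate = old_batch / old_wait * 60   # 600.0 spans/min
new_rate = new_batch / new_wait * 60   # 200.0 spans/min

print(old_rate, new_rate)
```

So a service producing between 200 and 600 spans per minute kept up under the old default but falls behind under the new one unless it overrides the batch size.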

@kornholi (Contributor, Author)
We've been running with 200/60s for a couple months without issues, as the previous default ran through quota immediately. Perhaps we should bump up the batch size to 600?

@kornholi (Contributor, Author) commented Jan 17, 2019

Either way, I think we need to update the code to submit as many batches as possible per wait period. Otherwise services under high load will buffer stats/traces until they run out of memory, as only _DEFAULT_MAX_BATCH_SIZE entries can be submitted per minute.
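The fix suggested above is to drain the queue completely on each flush rather than exporting a single batch per wait period. A minimal sketch of that worker loop, assuming a `queue.Queue` of pending entries and an `export(batch)` callable (`_WAIT_PERIOD` and `flush_all` are hypothetical names; `_DEFAULT_MAX_BATCH_SIZE` is the constant named in the comment above):

```python
import queue
import threading

_DEFAULT_MAX_BATCH_SIZE = 200  # spans per batch
_WAIT_PERIOD = 60              # seconds between flushes (hypothetical name)

def flush_all(q, export, batch_size=_DEFAULT_MAX_BATCH_SIZE):
    """Drain the queue completely, exporting one batch at a time,
    so a single flush is no longer capped at one batch."""
    while True:
        batch = []
        while len(batch) < batch_size:
            try:
                batch.append(q.get_nowait())
            except queue.Empty:
                break
        if not batch:
            return
        export(batch)

def worker(q, export, stop_event):
    """Background loop: wake up every _WAIT_PERIOD seconds and flush
    everything buffered, then do a final drain on shutdown."""
    while not stop_event.wait(_WAIT_PERIOD):
        flush_all(q, export)
    flush_all(q, export)
```

With this shape, a service buffering 450 entries in one period exports three batches (200, 200, 50) on the next flush instead of one, so the backlog cannot grow without bound as long as the exporter keeps up on average.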

@c24t (Member) commented Jan 17, 2019

Bumping the batch size up sounds good to me.

@c24t c24t merged commit 2857864 into census-instrumentation:master Jan 17, 2019