
from_context values aren't passed by the headers_setter if the pipeline contains the batch processor #29271

Closed
romanvogman opened this issue Nov 14, 2023 · 7 comments
Labels
bug, extension/headerssetter, needs triage, waiting for author

Comments

@romanvogman

Component(s)

No response

What happened?

Description

Using the latest image, 0.88.0, I've noticed that the limitation previously mentioned here still occurs: the collector with the batch processor doesn't retrieve the header value passed in the from_context section. Removing the batch processor from the pipeline resolves it.

Steps to Reproduce

Add the batch processor to the pipeline and a headers_setter section with from_context.
Example:

headers_setter config section:
    extensions:
      headers_setter:
        headers:
        - from_context: tenant_id
          key: X-Scope-OrgID
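
For completeness, here's a minimal sketch of how the extension is wired into the rest of the config (receiver and exporter names are illustrative; the include_metadata flag and the exporter auth block follow the headers_setter README rather than being copied verbatim from my setup):

    receivers:
      otlp:
        protocols:
          http:
            # capture incoming request headers (e.g. tenant_id) into the request context
            include_metadata: true

    exporters:
      otlphttp:
        endpoint: http://localhost:4319
        # let the headers_setter extension inject X-Scope-OrgID on outgoing requests
        auth:
          authenticator: headers_setter

    service:
      extensions:
      - headers_setter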

Headers sent:

POST /v1/traces HTTP/1.1
Host: localhost:4319
User-Agent: OTel-OTLP-Exporter-Python/1.20.0
Accept-Encoding: gzip, deflate
Accept: */*
Connection: keep-alive
tenant_id: foo
Content-Type: application/x-protobuf
Content-Length: 226 

Headers received:

POST /v1/traces HTTP/1.1
Host: ubuntu:4319
User-Agent: OpenTelemetry Collector Contrib/0.88.0 (linux/amd64)
Content-Length: 214
Content-Encoding: gzip
Content-Type: application/x-protobuf
X-Scope-Orgid:
Accept-Encoding: gzip

Removing the batch processor from the pipeline fixes this issue and the X-Scope-Orgid header contains the expected value.

Collector version

0.88.0

Environment information

No response

OpenTelemetry Collector configuration

No response

Log output

No response

Additional context

No response

@romanvogman added the bug and needs triage labels on Nov 14, 2023
Contributor

Pinging code owners for extension/headerssetter: @jpkrohling. See Adding Labels via Comments if you do not have permissions to add labels yourself.

@crobert-1
Member

crobert-1 commented Nov 16, 2023

Hello @romanvogman, can you share the full configuration of the collector that you're using? I want to confirm that the metadata_keys option is set properly on the batch processor. Here's a link to what I'm referring to.

@jpkrohling
Member

@romanvogman, can you confirm you used metadata_keys with the batch processor?

@romanvogman
Author

romanvogman commented Nov 28, 2023

Hey,

Most of the configuration came as-is from the otel-collector Helm chart. The batch processor is added by the default values.yaml, which we've removed at this point to fix the headers_setter issue. It's empty in the batch configuration, but it is used in the pipeline section:
https://github.com/open-telemetry/opentelemetry-helm-charts/blob/main/charts/opentelemetry-collector/values.yaml#L100
https://github.com/open-telemetry/opentelemetry-helm-charts/blob/main/charts/opentelemetry-collector/values.yaml#L141

Here's how it looks when we apply it:

    processors:
      batch: {}


    service:
      pipelines:
        traces:
          exporters:
          - otlphttp
          processors:
          - memory_limiter
          - batch
          receivers:
          - otlp
      telemetry:
        metrics:
          address: ${env:MY_POD_IP}:8888

I haven't found a reference to metadata_keys.

@jpkrohling
Member

Without that, the batch processor won't take the context into consideration, which explains what you are seeing. Please take a look at the documentation for the batch processor, linked from @crobert-1's earlier comment:

https://github.com/open-telemetry/opentelemetry-collector/tree/main/processor/batchprocessor#batching-and-client-metadata
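
In other words, something along these lines should be enough (a minimal sketch assuming the tenant_id key from your headers_setter config; metadata_cardinality_limit is optional and shown only for illustration):

    processors:
      batch:
        # keep a separate batch per distinct tenant_id value so the client
        # metadata survives batching and headers_setter can still read it
        metadata_keys:
        - tenant_id
        # optional: cap the number of distinct metadata combinations
        metadata_cardinality_limit: 1000

Note that the receiver also needs include_metadata: true, otherwise there is no client metadata for the batch processor to key on in the first place.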

@romanvogman
Author

Thanks for your feedback @jpkrohling, but in our case the batch processor isn't needed (at the moment); it just wasn't working correctly with the default configuration from the Helm chart, which has batch: {}.

I wish it was easier to find though, it would have saved me a good few hours of debugging :)

@jpkrohling
Member

Sorry about that. Where would you have looked to debug this?
