receiver/otlp to exporter/syslog not delivering expected output #35317

Open
bluestripe opened this issue Sep 20, 2024 · 2 comments
Labels
bug (Something isn't working), exporter/syslog, needs triage (New item requiring triage), Stale


Component(s)

exporter/syslog

What happened?

Description

When exporting OTLP log messages to syslog, the resulting message on the syslog side only contains the timestamp from the OTLP record. The message body and the priority do not get picked up.

Steps to reproduce

Run the docker compose command:

docker compose --project-name syslog-issue up --detach && docker exec syslog-ng bash -c 'chown -R abc:abc /var/log'

with the following docker-compose.yml:

services:

  otel-collector:
    command: ["--config=/conf/config.yaml"]
    configs:
      - source: otel-config
        target: /conf/config.yaml
    container_name: otel-collector
    image: otel/opentelemetry-collector-contrib:0.109.0
    networks:
      log-net:
        ipv4_address: 172.19.0.2
    ports:
      - 4318:4318
    restart: unless-stopped

  syslog-server:
    container_name: syslog-ng
    image: lscr.io/linuxserver/syslog-ng:4.7.1
    networks:
      log-net:
        ipv4_address: 172.19.0.3
    ports:
      - 6601:6601/tcp
    restart: unless-stopped

networks:
  log-net:
    driver: bridge
    ipam:
      config:
        - subnet: 172.19.0.0/24
          gateway: 172.19.0.1

configs:
  otel-config:
    content: |
      receivers:
        otlp:
          protocols:
            http:
              endpoint: 172.19.0.2:4318

      exporters:
        debug:
          verbosity: detailed
        syslog:
          endpoint: 172.19.0.3
          network: tcp
          port: 6601
          protocol: rfc5424
          tls:
            insecure: true
          enable_octet_counting: true

      processors:
        batch:

      service:
        telemetry:
          logs:
            level: "debug"
        pipelines:
          logs:
            receivers: [otlp]
            processors: [batch]
            exporters: [debug,syslog]

Watch the incoming message as sent by the otel-collector, from inside the syslog-ng container:

docker exec syslog-ng bash -c 'apk add tcpdump && tcpdump -A -nnn tcp and host 172.19.0.2'

Use curl to send an OTLP message to the collector from the Docker host:

curl -X POST -H "Content-Type: application/json" -d @logs.json -i localhost:4318/v1/logs

with the following content for logs.json:

{"resourceLogs":[{"scopeLogs":[{"logRecords":[{"timeUnixNano":"1544712660300000000","observedTimeUnixNano":"1544712660300000000","severityNumber":9,"severityText":"info","body":{"stringValue":"Example log record"}}]}]}]}

Expected result

<9>1 2018-12-13T14:51:00.3Z - - - - - Example log record

Actual result

<165>1 2018-12-13T14:51:00.3Z - - - - -

The timestamp is picked up from the OTLP log entry. The result deviates from expectations in two places:

  • The priority is emitted as 165 instead of the 9 expected from the severityNumber input (see the PRI breakdown below).
  • The body does not get picked up as the message of the exported syslog entry.
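
For context (RFC 5424 arithmetic, not taken from the exporter's source): the PRI value is facility × 8 + severity, so <165> decodes to facility 20 (local4) and severity 5 (notice). That looks like a fixed fallback priority rather than anything derived from severityNumber 9. A minimal decoding sketch in Go:

// pri_decode.go: splits an RFC 5424 PRI value into facility and severity
// (PRI = facility*8 + severity), applied to the observed value 165.
package main

import "fmt"

func main() {
	pri := 165
	facility := pri / 8 // 20 -> local4
	severity := pri % 8 // 5  -> notice
	fmt.Printf("PRI %d = facility %d, severity %d\n", pri, facility, severity)
}

Together with the workaround below, this suggests the exporter builds the syslog message from log record attributes (such as priority and message) rather than from the OTLP severity and body fields.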

Is this by design, or is the syslog exporter not working as intended?

Workaround

By adding a transform processor to the otel-collector configuration, the expected result can be achieved:

...
processors:
  batch:
  transform/syslog:
    log_statements:
      - context: log
        statements:
          - set(attributes["message"], body)
          - set(attributes["priority"], severity_number)

service:
  telemetry:
    logs:
      level: "debug"
  pipelines:
    logs:
      receivers: [otlp]
      processors: [transform/syslog,batch]
      exporters: [debug,syslog]

This renders the result as expected:

<9>1 2018-12-13T14:51:00.3Z - - - - - Example log record

Is this expected behavior, and does a transform mapping all expected attributes always have to be added?

Collector version

0.109.0

Environment information

Environment

Docker Desktop 4.33.0 (160616) on macOS 15.0

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      http:
        endpoint: 172.19.0.2:4318

exporters:
  debug:
    verbosity: detailed
  syslog:
    endpoint: 172.19.0.3
    network: tcp
    port: 6601
    protocol: rfc5424
    tls:
      insecure: true
    enable_octet_counting: true

processors:
  batch:

service:
  telemetry:
    logs:
      level: "debug"
  pipelines:
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug,syslog]

Log output

2024-09-20T10:51:47.230Z	info	[email protected]/service.go:129	Setting up own telemetry...
2024-09-20T10:51:47.230Z	warn	[email protected]/service.go:196	service::telemetry::metrics::address is being deprecated in favor of service::telemetry::metrics::readers
2024-09-20T10:51:47.230Z	info	[email protected]/telemetry.go:98	Serving metrics	{"address": ":8888", "metrics level": "Normal"}
2024-09-20T10:51:47.230Z	info	builders/builders.go:26	Development component. May change in the future.{"kind": "exporter", "data_type": "logs", "name": "debug"}
2024-09-20T10:51:47.230Z	debug	builders/builders.go:24	Alpha component. May change in the future.	{"kind": "exporter", "data_type": "logs", "name": "syslog"}
2024-09-20T10:51:47.230Z	info	[email protected]/exporter.go:45	Syslog Exporter configured	{"kind": "exporter", "data_type": "logs", "name": "syslog", "endpoint": "172.19.0.3", "protocol": "rfc5424", "network": "tcp", "port": 6601}
2024-09-20T10:51:47.230Z	debug	builders/builders.go:24	Beta component. May change in the future.	{"kind": "processor", "name": "batch", "pipeline": "logs"}
2024-09-20T10:51:47.230Z	debug	builders/builders.go:24	Beta component. May change in the future.	{"kind": "receiver", "name": "otlp", "data_type": "logs"}
2024-09-20T10:51:47.231Z	info	[email protected]/service.go:213	Starting otelcol-contrib...	{"Version": "0.109.0", "NumCPU": 10}
2024-09-20T10:51:47.231Z	info	extensions/extensions.go:39	Starting extensions...
2024-09-20T10:51:47.231Z	info	[email protected]/otlp.go:153	Starting HTTP server	{"kind": "receiver", "name": "otlp", "data_type": "logs", "endpoint": "172.19.0.2:4318"}
2024-09-20T10:51:47.231Z	info	[email protected]/service.go:239	Everything is ready. Begin running and processing data.
2024-09-20T10:51:47.231Z	info	localhostgate/featuregate.go:63	The default endpoints for all servers in components have changed to use localhost instead of 0.0.0.0. Disable the feature gate to temporarily revert to the previous default.	{"feature gate ID": "component.UseLocalHostAsDefaultHost"}
2024/09/20 10:52:00 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp/internal/request.(*RespWriterWrapper).writeHeader (resp_writer_wrapper.go:78)
2024-09-20T10:52:01.056Z	info	LogsExporter	{"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 1}
2024-09-20T10:52:01.057Z	info	ResourceLog #0
Resource SchemaURL:
ScopeLogs #0
ScopeLogs SchemaURL:
InstrumentationScope
LogRecord #0
ObservedTimestamp: 2018-12-13 14:51:00.3 +0000 UTC
Timestamp: 2018-12-13 14:51:00.3 +0000 UTC
SeverityText: info
SeverityNumber: Info(9)
Body: Str(Example log record)
Trace ID:
Span ID:
Flags: 0
	{"kind": "exporter", "data_type": "logs", "name": "debug"}

Additional context

No response

bluestripe added the bug (Something isn't working) and needs triage (New item requiring triage) labels on Sep 20, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label on Nov 20, 2024