
Nodejs instrumentation for scoped packages fails to install itself on Windows #4402

Closed · Fixed by #4467

yandrushchak opened this issue Jan 5, 2024 · 1 comment

Labels: bug (Something isn't working), priority:p2 (Bugs and spec inconsistencies which cause telemetry to be incomplete or incorrect)

Comments

@yandrushchak

What happened?

Steps to Reproduce

Prerequisite: Windows environment

  1. Add redis package to project.
  2. Add @opentelemetry/instrumentation-redis-4 instrumentation (either explicitly or using @opentelemetry/auto-instrumentations-node).

Redis is used as an example; this should be reproducible with any scoped package (here, the @redis/client package is the one instrumented internally). For reference, explicit registration looks roughly like the sketch below.
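
A minimal sketch of the explicit registration path (for reference only; @opentelemetry/auto-instrumentations-node wires up the same instrumentation internally):

import { registerInstrumentations } from '@opentelemetry/instrumentation';
import { RedisInstrumentation } from '@opentelemetry/instrumentation-redis-4';

// Explicit registration of the redis v4 instrumentation; the
// auto-instrumentations package does the equivalent of this internally.
registerInstrumentations({
  instrumentations: [new RedisInstrumentation()],
});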

Expected Result

Redis is instrumented and redis-related spans are visible in traces.

Actual Result

Redis is not instrumented.

Additional Details

This happens for scoped packages due to the difference in path separators between Windows and Unix systems.

Internally, the @opentelemetry/instrumentation-redis-4 package tries to instrument @redis/client/dist/lib/commander.js, which on Windows gets normalized to @redis\\client\\dist\\lib\\commander.js in the InstrumentationNodeModuleFile.

However, when the require-in-the-middle hook resolves internal files, normalization is applied only to the file path, not to the module name itself; in this case @redis/client\\dist\\lib\\commander.js (source) is returned.

Due to this mismatch the instrumentation is never applied: the name comparison doesn't account for the mixed separators.
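
A small illustration of the mismatch (only meaningful on Windows, where path.sep is '\'; on POSIX, normalize() leaves forward slashes alone):

import { normalize } from 'node:path';

// Name registered by the instrumentation (normalize() on Windows turns
// every separator into a backslash):
const registered = normalize('@redis/client/dist/lib/commander.js');
// -> '@redis\client\dist\lib\commander.js'

// Name reported by the require-in-the-middle hook on Windows: the scoped
// module name keeps its forward slash, only the inner file path is
// backslash-separated:
const reported = '@redis/client\\dist\\lib\\commander.js';

console.log(registered === reported);            // false -> file is never instrumented
console.log(registered === normalize(reported)); // true  -> basis of the workaround below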

Possible workaround: path normalization can be applied to the name returned from the require-in-the-middle hook. The following fix worked for my setup:

import { normalize } from 'node:path';
import { InstrumentationBase } from '@opentelemetry/instrumentation';
import { getNodeAutoInstrumentations } from '@opentelemetry/auto-instrumentations-node';

// Patch the private _onRequire hook so the module name reported by
// require-in-the-middle is normalized before the instrumentation compares it
// against its registered file names. `module` and `exports` are internal
// structures; they are typed as `any` here since this reaches into a
// private API anyway.
const originalRequire = (InstrumentationBase as any).prototype._onRequire;
(InstrumentationBase as any).prototype._onRequire = function patchedOnRequire(
  module: any,
  exports: any,
  name: string,
  baseDir?: string
) {
  return originalRequire.call(this, module, exports, normalize(name), baseDir);
};
const instrumentations = getNodeAutoInstrumentations();
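
Note that this patches the shared InstrumentationBase prototype, so it applies to every registered instrumentation, and it relies on a private method (_onRequire) that may change between @opentelemetry/instrumentation versions. It also has to run before the instrumented packages are first required, since the hook fires at require time.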

OpenTelemetry Setup Code

import { getNodeAutoInstrumentations } from '@opentelemetry/auto-instrumentations-node';
import { NodeSDK } from '@opentelemetry/sdk-node';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-proto';
import { OTLPMetricExporter } from '@opentelemetry/exporter-metrics-otlp-proto';
import { PeriodicExportingMetricReader } from '@opentelemetry/sdk-metrics';
import { diag, DiagConsoleLogger, DiagLogLevel } from '@opentelemetry/api';

diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.INFO);

const instrumentations = getNodeAutoInstrumentations();

const otelSDK = new NodeSDK({
  traceExporter: new OTLPTraceExporter({
    url: process.env.OPENTELEMETRY_TRACE_EXPORTER_ENDPOINT,
  }),
  metricReader: new PeriodicExportingMetricReader({
    exporter: new OTLPMetricExporter({
      url: process.env.OPENTELEMETRY_METRICS_EXPORTER_ENDPOINT,
    }),
  }),
  serviceName: process.env.OPENTELEMETRY_SERVICE_NAME,
  instrumentations: [instrumentations],
});

process.on('SIGTERM', () => {
  otelSDK
    .shutdown()
    .then(
      () => console.log('OpenTelemetry SDK shut down successfully'),
      (err) => console.log('Error shutting down OpenTelemetry SDK', err)
    )
    .finally(() => process.exit(0));
});

otelSDK.start();

package.json

Relevant dependencies:
{
    "@opentelemetry/api": "^1.7.0",
    "@opentelemetry/auto-instrumentations-node": "^0.40.3",
    "@opentelemetry/context-async-hooks": "^1.19.0",
    "@opentelemetry/core": "^1.19.0",
    "@opentelemetry/exporter-metrics-otlp-proto": "^0.46.0",
    "@opentelemetry/exporter-trace-otlp-proto": "^0.46.0",
    "@opentelemetry/instrumentation": "^0.46.0",
    "@opentelemetry/sdk-metrics": "^1.19.0",
    "@opentelemetry/sdk-node": "^0.46.0",
    "cache-manager": "^5.3.2",
    "cache-manager-redis-store": "^3.0.1"
}

Relevant log output

No response

yandrushchak added the bug (Something isn't working) and triage labels on Jan 5, 2024
dyladan added the priority:p2 (Bugs and spec inconsistencies which cause telemetry to be incomplete or incorrect) label and removed the bug label on Jan 5, 2024
dyladan (Member) commented Jan 5, 2024

We can probably normalize the path in the instrumentation module before doing the comparison. We would just want to make sure we're not breaking anything else when we do that.
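
For illustration, a minimal sketch of that approach (a hypothetical helper, not the actual upstream code; `files` with a `name` field mirrors InstrumentationModuleFile, but the real comparison site inside InstrumentationBase's require hook may look different):

import { normalize } from 'node:path';

// Normalize the name reported by require-in-the-middle once, before comparing
// it against the registered file names, so that
// '@redis/client\dist\lib\commander.js' matches the registered
// '@redis\client\dist\lib\commander.js' on Windows.
function findMatchingFile(
  files: { name: string }[],
  reportedName: string
): { name: string } | undefined {
  const normalized = normalize(reportedName);
  return files.find((f) => f.name === normalized);
}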
