I am configuring the OpenTelemetry Collector to gather logs through the filelog receiver and transmit them to a second OpenTelemetry Collector via the otlphttp exporter. The first collector ingests logs from files and forwards them over OTLP/HTTP; in this configuration, the second collector acts as a proxy in front of the backend.
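For context, here is a minimal sketch of the first collector's configuration, reconstructed from the log output below. The component names (filelog, hostmetrics, memory_limiter, otlphttp, logging, file_storage), the memory_limiter values (limit_mib 400, spike_limit_mib 250, check_interval 1s), and the target port 4320 all appear in the logs; the include path and pipeline wiring are assumptions.

```yaml
# First collector (agent) - sketch, not the exact config used.
receivers:
  filelog:
    include: [ "C:\\ERRORLOG" ]   # watched file path, per "Started watching file" log line
  hostmetrics:
    scrapers:
      cpu:

processors:
  memory_limiter:
    check_interval: 1s
    limit_mib: 400
    spike_limit_mib: 250

exporters:
  otlphttp:
    endpoint: http://127.0.0.1:4320   # second collector's OTLP/HTTP endpoint
  logging:

extensions:
  file_storage:

service:
  extensions: [file_storage]
  pipelines:
    logs:
      receivers: [filelog]
      processors: [memory_limiter]
      exporters: [otlphttp, logging]
    metrics:
      receivers: [hostmetrics]
      processors: [memory_limiter]
      exporters: [otlphttp, logging]
```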
When a log file was moved into place and processed by the first collector, both collectors reached their memory_limiter soft limits (the same behavior occurs when they reach the hard limit). At that point, the second collector started refusing data and pushing it back to its receiver. As a result, the first collector reported export errors for both logs and metrics and dropped the data.
I expected the first collector to receive a retryable error, such as HTTP 429 (Too Many Requests), so that pending requests could be resent once the second collector resumed normal operation. Instead, the second collector responds with HTTP 500, which the otlphttp exporter treats as a permanent error and drops the data.
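As a hypothetical mitigation sketch, enabling retries and a persistent queue on the first collector's otlphttp exporter looks like this (all values here are illustrative assumptions, not my actual config). Note that this does not help in the scenario above, because the exporterhelper classifies the HTTP 500 from the second collector's memory_limiter as a permanent, non-retryable error, so the request never re-enters the retry loop:

```yaml
exporters:
  otlphttp:
    endpoint: http://127.0.0.1:4320
    retry_on_failure:
      enabled: true
      initial_interval: 5s
      max_interval: 30s
      max_elapsed_time: 300s
    sending_queue:
      enabled: true
      storage: file_storage   # persist queued batches across collector restarts
```

Retries only take effect for transient failures (e.g. the "dial tcp ... connection refused" errors in the logs, which are retried with backoff); a retryable status code from the downstream memory_limiter would make this configuration sufficient to avoid data loss.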
Steps to reproduce
First Collector Started
Second Collector Started
Move the ERRORLOG file (~1 million log records) to the watched destination
Turn off Second Collector
Turn off First Collector
Communication between 2 Instances of OpenTelemetry Collector with memory_limiter processors
PS C:\Users\kCuraCloudAdmin> C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\otelcol-contrib.exe --config C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\config.yaml
2024-02-12T12:12:06.088Z info [email protected]/service.go:165 Everything is ready. Begin running and processing data.
2024-02-12T12:12:07.229Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:12:25.077Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:12:25.078Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "5.5346304s"}
2024-02-12T12:12:27.150Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:12:27.322Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.706840488s"}
2024-02-12T12:12:27.475Z info [email protected]/collector.go:258 Received signal from OS {"signal": "interrupt"}
2024-02-12T12:12:27.475Z info [email protected]/service.go:179 Starting shutdown...
2024-02-12T12:12:27.479Z info adapter/receiver.go:140 Stopping stanza receiver {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-12T12:12:27.496Z info adapter/receiver.go:140 Stopping stanza receiver {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-12T12:12:27.517Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:12:27.523Z info extensions/extensions.go:59 Stopping extensions...
2024-02-12T12:12:27.523Z info [email protected]/service.go:193 Shutdown complete.
PS C:\Users\kCuraCloudAdmin> C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\otelcol-contrib.exe --config C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\config.yaml
2024-02-12T12:13:12.508Z info [email protected]/telemetry.go:76 Setting up own telemetry...
2024-02-12T12:13:12.508Z info [email protected]/telemetry.go:146 Serving metrics {"address": ":8888", "level": "Basic"}
2024-02-12T12:13:12.512Z info [email protected]/exporter.go:275 Deprecated component. Will be removed in future releases. {"kind": "exporter", "data_type": "logs", "name": "logging"}
2024-02-12T12:13:12.513Z info [email protected]/exporter.go:275 Deprecated component. Will be removed in future releases. {"kind": "exporter", "data_type": "metrics", "name": "logging"}
2024-02-12T12:13:12.514Z info memorylimiter/memorylimiter.go:77 Memory limiter configured {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "limit_mib": 400, "spike_limit_mib": 250, "check_interval": 1}
2024-02-12T12:13:12.514Z info [email protected]/service.go:139 Starting otelcol-contrib... {"Version": "0.93.0", "NumCPU": 2}
2024-02-12T12:13:12.515Z info extensions/extensions.go:34 Starting extensions...
2024-02-12T12:13:12.515Z info extensions/extensions.go:37 Extension is starting... {"kind": "extension", "name": "file_storage"}
2024-02-12T12:13:12.515Z info extensions/extensions.go:52 Extension started. {"kind": "extension", "name": "file_storage"}
2024-02-12T12:13:12.516Z info adapter/receiver.go:45 Starting stanza receiver {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-12T12:13:12.547Z warn fileconsumer/file.go:51 finding files: no files match the configured criteria {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer"}
2024-02-12T12:13:12.567Z info adapter/receiver.go:45 Starting stanza receiver {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-12T12:13:12.582Z warn fileconsumer/file.go:51 finding files: no files match the configured criteria {"kind": "receiver", "name": "filelog/continues", "data_type": "logs", "component": "fileconsumer"}
2024-02-12T12:13:12.594Z info [email protected]/service.go:165 Everything is ready. Begin running and processing data.
2024-02-12T12:13:13.757Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:13.758Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "3.935162309s"}
2024-02-12T12:13:15.631Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:15.636Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.263222629s"}
2024-02-12T12:13:17.739Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:19.633Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:31.773Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:32.702Z info fileconsumer/file.go:268 Started watching file {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer", "path": "C:\\ERRORLOG"}
2024-02-12T12:13:32.708Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:32.712Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:33.759Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.018Z info memorylimiter/memorylimiter.go:222 Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 155}
2024-02-12T12:13:36.087Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.708Z error scraperhelper/scrapercontroller.go:200 Error scraping metrics {"kind": "receiver", "name": "hostmetrics", "data_type": "metrics", "error": "context deadline exceeded", "scraper": "cpu"}
go.opentelemetry.io/collector/receiver/scraperhelper.(*controller).scrapeMetricsAndReport
go.opentelemetry.io/collector/[email protected]/scraperhelper/scrapercontroller.go:200
go.opentelemetry.io/collector/receiver/scraperhelper.(*controller).startScraping.func1
go.opentelemetry.io/collector/[email protected]/scraperhelper/scrapercontroller.go:176
2024-02-12T12:13:36.714Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 1, "metrics": 1, "data points": 2}
2024-02-12T12:13:36.839Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.922Z info memorylimiter/memorylimiter.go:192 Memory usage after GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 130}
2024-02-12T12:13:36.923Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:37.532Z warn memorylimiter/memorylimiter.go:229 Memory usage is above soft limit. Refusing data. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 210}
2024-02-12T12:13:38.519Z info memorylimiter/memorylimiter.go:215 Memory usage back within limits. Resuming normal operation. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 101}
2024-02-12T12:13:39.664Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:41.769Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1000}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createLogsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:13:42.239Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:43.294Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:43.308Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:43.695Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:44.224Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1000}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createLogsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:13:44.297Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:44.305Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:45.313Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:45.331Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:45.606Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1000}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createLogsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:13:45.767Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:52.831Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 1}
2024-02-12T12:13:52.832Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createLogsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:13:53.720Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:01.684Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:03.768Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:27.767Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:27.768Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:33.598Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:33.600Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:35.669Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:35.670Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:37.743Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:49.606Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:51.686Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:51.688Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:53.763Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:53.764Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "2.528889573s"}
2024-02-12T12:14:55.635Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:55.637Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "4.469254533s"}
2024-02-12T12:14:56.312Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "3.781748409s"}
2024-02-12T12:14:57.723Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:57.724Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "5.922314888s"}
2024-02-12T12:14:59.590Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:59.591Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "5.101666595s"}
2024-02-12T12:15:00.106Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.260620386s"}
2024-02-12T12:15:00.124Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "10.609948295s"}
2024-02-12T12:15:01.673Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:01.673Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.600585342s"}
2024-02-12T12:15:03.661Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "3.893131851s"}
2024-02-12T12:15:03.758Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:03.761Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "5.631966261s"}
2024-02-12T12:15:04.714Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.862112453s"}
2024-02-12T12:15:05.622Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:05.622Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.964815392s"}
2024-02-12T12:15:07.381Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "12.618459834s"}
2024-02-12T12:15:07.558Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "11.67278761s"}
2024-02-12T12:15:07.698Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:07.699Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "4.90095434s"}
2024-02-12T12:15:08.280Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.703637679s"}
2024-02-12T12:15:09.409Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.030750379s"}
2024-02-12T12:15:09.771Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:09.773Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.588990315s"}
2024-02-12T12:15:10.748Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "16.49570526s"}
2024-02-12T12:15:10.864Z info [email protected]/collector.go:258 Received signal from OS {"signal": "interrupt"}
2024-02-12T12:15:10.864Z info [email protected]/service.go:179 Starting shutdown...
2024-02-12T12:15:10.869Z info adapter/receiver.go:140 Stopping stanza receiver {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-12T12:15:10.884Z info adapter/receiver.go:140 Stopping stanza receiver {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-12T12:15:10.897Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.905Z info extensions/extensions.go:59 Stopping extensions...
2024-02-12T12:15:10.906Z info [email protected]/service.go:193 Shutdown complete.
Second OpenTelemetry Collector Configuration
receivers:
  otlp:
    protocols:
      http:
        endpoint: "127.0.0.1:4320"
exporters:
  logging:
  otlphttp:
    endpoint: http://127.0.0.1:4318
    tls:
      insecure: true
processors:
  batch:
    send_batch_size: 1000
  memory_limiter:
    check_interval: 1s
    limit_mib: 400
    spike_limit_mib: 250
service:
  telemetry:
    metrics:
      address: "0.0.0.0:9090" # Use a port that you know is free
  pipelines:
    logs:
      receivers: [otlp]
      processors: [memory_limiter, batch]
      exporters: [logging, otlphttp]
    metrics:
      receivers: [otlp]
      processors: [memory_limiter, batch]
      exporters: [logging, otlphttp]
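For reference, the memory_limiter derives its soft limit by subtracting spike_limit_mib from limit_mib; with the values above that is 400 - 250 = 150 MiB, which matches the "Refusing data" log line later showing cur_mem_mib at 150. A minimal sketch of that arithmetic (the function name is mine, not a collector API):

```python
def memory_limiter_thresholds(limit_mib: int, spike_limit_mib: int) -> tuple[int, int]:
    """Return (hard, soft) limits in MiB.

    The hard limit is limit_mib itself; the soft limit is the hard limit
    minus the spike allowance, so data is refused before memory peaks.
    """
    hard = limit_mib
    soft = limit_mib - spike_limit_mib
    return hard, soft

hard, soft = memory_limiter_thresholds(400, 250)
print(hard, soft)  # 400 150
```

So with this config the collector starts refusing data well below the 400 MiB hard limit, at roughly 150 MiB of measured usage.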
Log output for second Collector
PS C:\Users\kCuraCloudAdmin> C:\\collector\\second\\otelcol-contrib.exe --config C:\\collector\\second\\config.yaml
2024-02-12T12:13:15.717Z info [email protected]/telemetry.go:76 Setting up own telemetry...
2024-02-12T12:13:15.718Z info [email protected]/telemetry.go:146 Serving metrics {"address": "0.0.0.0:9090", "level": "Basic"}
2024-02-12T12:13:15.723Z info [email protected]/exporter.go:275 Deprecated component. Will be removed in future releases. {"kind": "exporter", "data_type": "metrics", "name": "logging"}
2024-02-12T12:13:15.730Z info memorylimiter/memorylimiter.go:77 Memory limiter configured {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "limit_mib": 400, "spike_limit_mib": 250, "check_interval": 1}
2024-02-12T12:13:15.730Z info [email protected]/exporter.go:275 Deprecated component. Will be removed in future releases. {"kind": "exporter", "data_type": "logs", "name": "logging"}
2024-02-12T12:13:15.730Z info [email protected]/service.go:139 Starting otelcol-contrib... {"Version": "0.93.0", "NumCPU": 2}
2024-02-12T12:13:15.730Z info extensions/extensions.go:34 Starting extensions...
2024-02-12T12:13:15.734Z info [email protected]/otlp.go:152 Starting HTTP server {"kind": "receiver", "name": "otlp", "data_type": "metrics", "endpoint": "127.0.0.1:4320"}
2024-02-12T12:13:15.739Z info [email protected]/service.go:165 Everything is ready. Begin running and processing data.
2024-02-12T12:13:17.806Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 4, "metrics": 4, "data points": 20}
2024-02-12T12:13:19.698Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:21.796Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:23.056Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:23.680Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:25.786Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:27.677Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:29.754Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:31.838Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:32.720Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:32.732Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:33.759Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:33.887Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:34.838Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 11, "log records": 1010}
2024-02-12T12:13:34.916Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.044Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.091Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.922Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 1, "metrics": 1, "data points": 2}
2024-02-12T12:13:37.077Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:37.098Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:38.733Z info memorylimiter/memorylimiter.go:222 Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 164}
2024-02-12T12:13:39.139Z info memorylimiter/memorylimiter.go:192 Memory usage after GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 107}
2024-02-12T12:13:39.806Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:41.734Z warn memorylimiter/memorylimiter.go:229 Memory usage is above soft limit. Refusing data. {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 150}
2024-02-12T12:13:42.732Z info memorylimiter/memorylimiter.go:215 Memory usage back within limits. Resuming normal operation. {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 132}
2024-02-12T12:13:42.742Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:42.750Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:43.734Z warn memorylimiter/memorylimiter.go:229 Memory usage is above soft limit. Refusing data. {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 228}
2024-02-12T12:13:43.908Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:52.290Z info [email protected]/collector.go:258 Received signal from OS {"signal": "interrupt"}
2024-02-12T12:14:52.290Z info [email protected]/service.go:179 Starting shutdown...
2024-02-12T12:14:52.305Z info extensions/extensions.go:59 Stopping extensions...
2024-02-12T12:14:52.305Z info [email protected]/service.go:193 Shutdown complete.
Results
Test Scenario
First Collector Started
Second Collector Started
Move the Errorlog file (1 million log lines) to the destination
Turn off Second Collector
Turn off First Collector
Initially, the first OpenTelemetry Collector was unable to send metrics to the second collector and encountered a retryable error:
2024-02-12T12:13:13.758Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "3.935162309s"}
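The varying retry intervals in these log lines come from jittered exponential backoff in the exporterhelper retry_sender. A minimal sketch, assuming the collector's documented retry_on_failure defaults (initial_interval 5s, multiplier 1.5, randomization factor 0.5, max_interval 30s); the function name and structure are mine, not the actual implementation:

```python
import random

def backoff_intervals(initial: float = 5.0, multiplier: float = 1.5,
                      randomization: float = 0.5, max_interval: float = 30.0,
                      attempts: int = 5) -> list[float]:
    """Generate jittered exponential backoff waits, in seconds.

    Each wait is drawn uniformly from [base - delta, base + delta] where
    delta = randomization * base; the base grows geometrically and is
    capped at max_interval.
    """
    base = initial
    waits = []
    for _ in range(attempts):
        delta = randomization * base
        waits.append(random.uniform(base - delta, base + delta))
        base = min(base * multiplier, max_interval)
    return waits

print(backoff_intervals())
```

Intervals like 3.9s, 5.6s, 7.8s in the log above are consistent with this scheme: a 5s base with 50% jitter yields a first wait anywhere between 2.5s and 7.5s.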
After the second collector became available, metrics were successfully transmitted. When the log file was moved and processed by the first collector, both collectors reached their soft limits; the same behavior occurs when they reach the hard limit. At that point, the second collector started refusing data, returning the error to its receiver. As a result, the first OpenTelemetry Collector encountered errors for both logs and metrics.
2024-02-12T12:14:33.600Z error exporterhelper/common.go:95 Exporting failed. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
I expected the first collector to receive a recoverable error, such as HTTP 429 Too Many Requests, so that pending requests could be resent once the second collector resumed normal operation instead of being dropped.
Finally, after the second collector was shut down, the first collector reported a retryable connection error:
2024-02-12T12:15:09.773Z info exporterhelper/retry_sender.go:118 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.588990315s"}
On Feb 26, 2024, marcinsiennicki95 changed the title from "[exporter/otlphttp] Responded with HTTP Status Code 500 instead of 429 if soft memory limit is reached" to "[receiver/otlp] Responded with HTTP Status Code 500 instead of 429 if soft memory limit is reached".
I found the OpenTelemetry specification describing how HTTP error codes should be handled. It looks like the current behavior is not implemented correctly, and the specified error codes are not followed.
If I understand correctly, in a chain like this: Collector 1 (client) with the OTLP exporter (or fluent-bit with OTLP output), and Collector 2 (server) with the OTLP receiver, the receiver on Collector 2 should return HTTP status codes as defined in the specification.
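Per the OTLP/HTTP specification, only 429, 502, 503, and 504 responses are retryable; everything else, including the 500 seen above, must be treated as permanent by the client, which is exactly why the first collector drops the data. A sketch of the classification a spec-compliant client would apply (the function name is mine):

```python
# Retryable response codes per the OTLP/HTTP spec's failure-handling rules.
RETRYABLE_HTTP_CODES = {429, 502, 503, 504}

def is_retryable(status_code: int) -> bool:
    """Return True if an OTLP/HTTP client should retry after this response.

    Any other status (including 500) signals a permanent failure, so the
    exporter drops the batch instead of re-queueing it.
    """
    return status_code in RETRYABLE_HTTP_CODES

print(is_retryable(429), is_retryable(500))  # True False
```

This is why the receiver answering 500 on memory pressure, rather than 429, turns a transient overload into permanent data loss on the sending side.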