
Otel-collector-contrib with prometheus exporter missing exemplars (TraceId and SpanId) #30197

Closed
dotnetstep opened this issue Dec 23, 2023 · 8 comments


@dotnetstep

Component(s)

exporter/prometheus

What happened?

Description

I am running a .NET application on the latest .NET 8 with the OpenTelemetry SDK. The application sends metrics over OTLP to port 4317, and an otel-collector-contrib Docker container is configured to receive, process, and export them in Prometheus format. The metrics sent by the application include exemplars, which are successfully received and processed by the otel-collector-contrib receiver. When they are exported to the debug exporter, the trace ID and span ID are present, but the Prometheus exporter output does not contain that information.

Steps to Reproduce

  1. Run otel-collector-contrib:latest in a Docker container with the configuration provided below.

Expected Result

Actual Result

Collector version

docker latest version

Environment information

Environment

OS: Windows and WSL2 (Docker installed inside a WSL 2 Ubuntu distro)

Attached zip file.

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      grpc:
      http:

processors:
  batch:
    timeout: 1s
    send_batch_size: 1024  

exporters:
  prometheus: # metrics
    endpoint: "0.0.0.0:8889"
    resource_to_telemetry_conversion:
      enabled: true    
    send_timestamps: true
    enable_open_metrics: true

  logging:
    verbosity: detailed

service:
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, prometheus]

Log output

ScopeMetrics #4
ScopeMetrics SchemaURL:
InstrumentationScope Examples.AspNetCore 1.0.0.0
Metric #0
Descriptor:
     -> Name: weather.days.freezing
     -> Description: The number of days where the temperature is below freezing
     -> Unit:
     -> DataType: Sum
     -> IsMonotonic: true
     -> AggregationTemporality: Cumulative
NumberDataPoints #0
StartTimestamp: 2023-12-23 14:06:49.9186663 +0000 UTC
Timestamp: 2023-12-23 14:22:31.7888414 +0000 UTC
Value: 15
Metric #1
Descriptor:
     -> Name: weather.days.test
     -> Description:
     -> Unit:
     -> DataType: Histogram
     -> AggregationTemporality: Cumulative
HistogramDataPoints #0
Data point attributes:
     -> JRecordTime: Str(5e5976f03750c0666579d5510772be63)
StartTimestamp: 2023-12-23 14:06:49.9231703 +0000 UTC
Timestamp: 2023-12-23 14:22:31.7888547 +0000 UTC
Count: 1
Sum: 108.000000
Min: 108.000000
Max: 108.000000
ExplicitBounds #0: 0.000000
ExplicitBounds #1: 5.000000
ExplicitBounds #2: 10.000000
ExplicitBounds #3: 25.000000
ExplicitBounds #4: 50.000000
ExplicitBounds #5: 75.000000
ExplicitBounds #6: 100.000000
ExplicitBounds #7: 250.000000
ExplicitBounds #8: 500.000000
ExplicitBounds #9: 750.000000
ExplicitBounds #10: 1000.000000
ExplicitBounds #11: 2500.000000
ExplicitBounds #12: 5000.000000
ExplicitBounds #13: 7500.000000
ExplicitBounds #14: 10000.000000
Buckets #0, Count: 0
Buckets #1, Count: 0
Buckets #2, Count: 0
Buckets #3, Count: 0
Buckets #4, Count: 0
Buckets #5, Count: 0
Buckets #6, Count: 0
Buckets #7, Count: 1
Buckets #8, Count: 0
Buckets #9, Count: 0
Buckets #10, Count: 0
Buckets #11, Count: 0
Buckets #12, Count: 0
Buckets #13, Count: 0
Buckets #14, Count: 0
Buckets #15, Count: 0
Exemplars:
Exemplar #0
     -> Trace ID: 5e5976f03750c0666579d5510772be63
     -> Span ID: bcfc4f2a812f7eb0
     -> Timestamp: 2023-12-23 14:06:51.0313611 +0000 UTC
     -> Value: 108.000000
HistogramDataPoints #1
Data point attributes:
     -> JRecordTime: Str(7cd584789f90adf922820c1a63452ee3)
StartTimestamp: 2023-12-23 14:06:49.9231703 +0000 UTC
Timestamp: 2023-12-23 14:22:31.7888547 +0000 UTC
Count: 1
Sum: 364.000000
Min: 364.000000
Max: 364.000000
ExplicitBounds #0: 0.000000
ExplicitBounds #1: 5.000000
ExplicitBounds #2: 10.000000
ExplicitBounds #3: 25.000000
ExplicitBounds #4: 50.000000
ExplicitBounds #5: 75.000000
ExplicitBounds #6: 100.000000
ExplicitBounds #7: 250.000000
ExplicitBounds #8: 500.000000
ExplicitBounds #9: 750.000000
ExplicitBounds #10: 1000.000000
ExplicitBounds #11: 2500.000000
ExplicitBounds #12: 5000.000000
ExplicitBounds #13: 7500.000000
ExplicitBounds #14: 10000.000000
Buckets #0, Count: 0
Buckets #1, Count: 0
Buckets #2, Count: 0
Buckets #3, Count: 0
Buckets #4, Count: 0
Buckets #5, Count: 0
Buckets #6, Count: 0
Buckets #7, Count: 0
Buckets #8, Count: 1
Buckets #9, Count: 0
Buckets #10, Count: 0
Buckets #11, Count: 0
Buckets #12, Count: 0
Buckets #13, Count: 0
Buckets #14, Count: 0
Buckets #15, Count: 0
Exemplars:
Exemplar #0
     -> Trace ID: 7cd584789f90adf922820c1a63452ee3
     -> Span ID: 34f8d6a5307230f2
     -> Timestamp: 2023-12-23 14:07:14.996346 +0000 UTC
     -> Value: 364.000000
HistogramDataPoints #2
Data point attributes:
     -> JRecordTime: Str(cc9375841b67cedcfeb5ca303e1bc627)
StartTimestamp: 2023-12-23 14:06:49.9231703 +0000 UTC
Timestamp: 2023-12-23 14:22:31.7888547 +0000 UTC
Count: 1
Sum: 298.000000
Min: 298.000000
Max: 298.000000
ExplicitBounds #0: 0.000000
ExplicitBounds #1: 5.000000
ExplicitBounds #2: 10.000000
ExplicitBounds #3: 25.000000
ExplicitBounds #4: 50.000000
ExplicitBounds #5: 75.000000
ExplicitBounds #6: 100.000000
ExplicitBounds #7: 250.000000
ExplicitBounds #8: 500.000000
ExplicitBounds #9: 750.000000
ExplicitBounds #10: 1000.000000
ExplicitBounds #11: 2500.000000
ExplicitBounds #12: 5000.000000
ExplicitBounds #13: 7500.000000
ExplicitBounds #14: 10000.000000
Buckets #0, Count: 0
Buckets #1, Count: 0
Buckets #2, Count: 0
Buckets #3, Count: 0
Buckets #4, Count: 0
Buckets #5, Count: 0
Buckets #6, Count: 0
Buckets #7, Count: 0
Buckets #8, Count: 1
Buckets #9, Count: 0
Buckets #10, Count: 0
Buckets #11, Count: 0
Buckets #12, Count: 0
Buckets #13, Count: 0
Buckets #14, Count: 0
Buckets #15, Count: 0
Exemplars:
Exemplar #0
     -> Trace ID: cc9375841b67cedcfeb5ca303e1bc627
     -> Span ID: 6d0e0f88d88344cf
     -> Timestamp: 2023-12-23 14:07:16.3220987 +0000 UTC
     -> Value: 298.000000
HistogramDataPoints #3
Data point attributes:
     -> JRecordTime: Str(2cbbe620b988d9355d18b71180b019ae)
StartTimestamp: 2023-12-23 14:06:49.9231703 +0000 UTC
Timestamp: 2023-12-23 14:22:31.7888547 +0000 UTC
Count: 1
Sum: 158.000000
Min: 158.000000
Max: 158.000000
ExplicitBounds #0: 0.000000
ExplicitBounds #1: 5.000000
ExplicitBounds #2: 10.000000
ExplicitBounds #3: 25.000000
ExplicitBounds #4: 50.000000
ExplicitBounds #5: 75.000000
ExplicitBounds #6: 100.000000
ExplicitBounds #7: 250.000000
ExplicitBounds #8: 500.000000
ExplicitBounds #9: 750.000000
ExplicitBounds #10: 1000.000000
ExplicitBounds #11: 2500.000000
ExplicitBounds #12: 5000.000000
ExplicitBounds #13: 7500.000000
ExplicitBounds #14: 10000.000000
Buckets #0, Count: 0
Buckets #1, Count: 0
Buckets #2, Count: 0
Buckets #3, Count: 0
Buckets #4, Count: 0
Buckets #5, Count: 0
Buckets #6, Count: 0
Buckets #7, Count: 1
Buckets #8, Count: 0
Buckets #9, Count: 0
Buckets #10, Count: 0
Buckets #11, Count: 0
Buckets #12, Count: 0
Buckets #13, Count: 0
Buckets #14, Count: 0
Buckets #15, Count: 0
Exemplars:
Exemplar #0
     -> Trace ID: 2cbbe620b988d9355d18b71180b019ae
     -> Span ID: 4f4c5a1aa66814e3
     -> Timestamp: 2023-12-23 14:07:19.7391379 +0000 UTC
     -> Value: 158.000000
HistogramDataPoints #4
Data point attributes:
     -> JRecordTime: Str(ce11c9d9aa6aa46ab7e5c5525843f072)
StartTimestamp: 2023-12-23 14:06:49.9231703 +0000 UTC
Timestamp: 2023-12-23 14:22:31.7888547 +0000 UTC
Count: 1
Sum: 198.000000
Min: 198.000000
Max: 198.000000
ExplicitBounds #0: 0.000000
ExplicitBounds #1: 5.000000
ExplicitBounds #2: 10.000000
ExplicitBounds #3: 25.000000
ExplicitBounds #4: 50.000000
ExplicitBounds #5: 75.000000
ExplicitBounds #6: 100.000000
ExplicitBounds #7: 250.000000
ExplicitBounds #8: 500.000000
ExplicitBounds #9: 750.000000
ExplicitBounds #10: 1000.000000
ExplicitBounds #11: 2500.000000
ExplicitBounds #12: 5000.000000
ExplicitBounds #13: 7500.000000
ExplicitBounds #14: 10000.000000
Buckets #0, Count: 0
Buckets #1, Count: 0
Buckets #2, Count: 0
Buckets #3, Count: 0
Buckets #4, Count: 0
Buckets #5, Count: 0
Buckets #6, Count: 0
Buckets #7, Count: 1
Buckets #8, Count: 0
Buckets #9, Count: 0
Buckets #10, Count: 0
Buckets #11, Count: 0
Buckets #12, Count: 0
Buckets #13, Count: 0
Buckets #14, Count: 0
Buckets #15, Count: 0
Exemplars:
Exemplar #0
     -> Trace ID: ce11c9d9aa6aa46ab7e5c5525843f072
     -> Span ID: 9b2e1b317a3c82ce
     -> Timestamp: 2023-12-23 14:07:21.0001502 +0000 UTC
     -> Value: 198.000000
HistogramDataPoints #5
Data point attributes:
     -> JRecordTime: Str(a440151b4a17bb51ea5691ff6ae6e38f)
StartTimestamp: 2023-12-23 14:06:49.9231703 +0000 UTC
Timestamp: 2023-12-23 14:22:31.7888547 +0000 UTC
Count: 1
Sum: 157.000000
Min: 157.000000
Max: 157.000000
ExplicitBounds #0: 0.000000
ExplicitBounds #1: 5.000000
ExplicitBounds #2: 10.000000
ExplicitBounds #3: 25.000000
ExplicitBounds #4: 50.000000
ExplicitBounds #5: 75.000000
ExplicitBounds #6: 100.000000
ExplicitBounds #7: 250.000000
ExplicitBounds #8: 500.000000
ExplicitBounds #9: 750.000000
ExplicitBounds #10: 1000.000000
ExplicitBounds #11: 2500.000000
ExplicitBounds #12: 5000.000000
ExplicitBounds #13: 7500.000000
ExplicitBounds #14: 10000.000000
Buckets #0, Count: 0
Buckets #1, Count: 0
Buckets #2, Count: 0
Buckets #3, Count: 0
Buckets #4, Count: 0
Buckets #5, Count: 0
Buckets #6, Count: 0
Buckets #7, Count: 1
Buckets #8, Count: 0
Buckets #9, Count: 0
Buckets #10, Count: 0
Buckets #11, Count: 0
Buckets #12, Count: 0
Buckets #13, Count: 0
Buckets #14, Count: 0
Buckets #15, Count: 0
Exemplars:
Exemplar #0
     -> Trace ID: a440151b4a17bb51ea5691ff6ae6e38f
     -> Span ID: b06f47e199d9b5af
     -> Timestamp: 2023-12-23 14:07:24.8404464 +0000 UTC
     -> Value: 157.000000
HistogramDataPoints #6
Data point attributes:
     -> JRecordTime: Str(ddd1838a66d8b4c542069bfb0a76da55)
StartTimestamp: 2023-12-23 14:06:49.9231703 +0000 UTC
Timestamp: 2023-12-23 14:22:31.7888547 +0000 UTC
Count: 1
Sum: 214.000000
Min: 214.000000
Max: 214.000000
ExplicitBounds #0: 0.000000
ExplicitBounds #1: 5.000000
ExplicitBounds #2: 10.000000
ExplicitBounds #3: 25.000000
ExplicitBounds #4: 50.000000
ExplicitBounds #5: 75.000000
ExplicitBounds #6: 100.000000
ExplicitBounds #7: 250.000000
ExplicitBounds #8: 500.000000
ExplicitBounds #9: 750.000000
ExplicitBounds #10: 1000.000000
ExplicitBounds #11: 2500.000000
ExplicitBounds #12: 5000.000000
ExplicitBounds #13: 7500.000000
ExplicitBounds #14: 10000.000000
Buckets #0, Count: 0
Buckets #1, Count: 0
Buckets #2, Count: 0
Buckets #3, Count: 0
Buckets #4, Count: 0
Buckets #5, Count: 0
Buckets #6, Count: 0
Buckets #7, Count: 1
Buckets #8, Count: 0
Buckets #9, Count: 0
Buckets #10, Count: 0
Buckets #11, Count: 0
Buckets #12, Count: 0
Buckets #13, Count: 0
Buckets #14, Count: 0
Buckets #15, Count: 0
Exemplars:
Exemplar #0
     -> Trace ID: ddd1838a66d8b4c542069bfb0a76da55
     -> Span ID: 642a9d56a6f59833
     -> Timestamp: 2023-12-23 14:07:26.1225342 +0000 UTC
     -> Value: 214.000000
HistogramDataPoints #7
Data point attributes:
     -> JRecordTime: Str(4649e2f715b46ed7c2af4fdb354e5491)
StartTimestamp: 2023-12-23 14:06:49.9231703 +0000 UTC
Timestamp: 2023-12-23 14:22:31.7888547 +0000 UTC
Count: 1
Sum: 173.000000
Min: 173.000000
Max: 173.000000
ExplicitBounds #0: 0.000000
ExplicitBounds #1: 5.000000
ExplicitBounds #2: 10.000000
ExplicitBounds #3: 25.000000
ExplicitBounds #4: 50.000000
ExplicitBounds #5: 75.000000
ExplicitBounds #6: 100.000000
ExplicitBounds #7: 250.000000
ExplicitBounds #8: 500.000000
ExplicitBounds #9: 750.000000
ExplicitBounds #10: 1000.000000
ExplicitBounds #11: 2500.000000
ExplicitBounds #12: 5000.000000
ExplicitBounds #13: 7500.000000
ExplicitBounds #14: 10000.000000
Buckets #0, Count: 0
Buckets #1, Count: 0
Buckets #2, Count: 0
Buckets #3, Count: 0
Buckets #4, Count: 0
Buckets #5, Count: 0
Buckets #6, Count: 0
Buckets #7, Count: 1
Buckets #8, Count: 0
Buckets #9, Count: 0
Buckets #10, Count: 0
Buckets #11, Count: 0
Buckets #12, Count: 0
Buckets #13, Count: 0
Buckets #14, Count: 0
Buckets #15, Count: 0
Exemplars:
Exemplar #0
     -> Trace ID: 4649e2f715b46ed7c2af4fdb354e5491
     -> Span ID: 8e56d310c3b6c052
     -> Timestamp: 2023-12-23 14:07:27.4223362 +0000 UTC
     -> Value: 173.000000
        {"kind": "exporter", "data_type": "metrics", "name": "logging"}

Additional context

Look at the logs of the otel-collector-contrib Docker container: it successfully logs the metrics with trace ID and span ID, which means they reach the debug exporter. The Prometheus exporter endpoint https://localhost:8889 (per the configuration above) also displays the metrics, but it is missing the exemplars.

@dotnetstep dotnetstep added bug Something isn't working needs triage New item requiring triage labels Dec 23, 2023
Contributor

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@crobert-1
Member

Hello @dotnetstep, just want to make sure I understand your problem, and we're not missing something simple here. Is the problem that your running prometheus instance is not receiving any exemplars? Or is the problem here that your received exemplars don't have the span_id and trace_id labels?

If you're receiving exemplars, can you confirm that you've checked for the proper naming of the span_id and trace_id labels?

I see that since you're sending monotonic sums, their exemplars should be carried through. The code path for this case hits the conversion that adds the trace_id and span_id labels:

if traceID := e.TraceID(); !traceID.IsEmpty() {
    exemplarLabels["trace_id"] = hex.EncodeToString(traceID[:])
}
if spanID := e.SpanID(); !spanID.IsEmpty() {
    exemplarLabels["span_id"] = hex.EncodeToString(spanID[:])
}

@dotnetstep
Author

dotnetstep commented Jan 3, 2024

  • As per my OTel configuration file: if Prometheus scrapes port 8889, it does not receive exemplars, even though I am sure they were sent from my application, since they appear in the debug exporter output.

  • I have enabled enable_open_metrics: true in the exporter.

  • I have tried another way: calling the endpoint with the header Accept: application/openmetrics-text; version=1.0.0; charset=utf-8, and this time the same scrape endpoint does return the exemplars. Unfortunately, Prometheus cannot be configured to pass an extra header (or if it can, I don't have that information).

  • I have tried the following in the OTel configuration and it works (though it has its own issue: requests/responses sometimes get terminated).

receivers:
  otlp:
    protocols:
      grpc:
      http:

processors:
  batch:
    timeout: 1s
    send_batch_size: 1024  

exporters:
  prometheus: # metrics
    endpoint: "0.0.0.0:8889"
    resource_to_telemetry_conversion:
      enabled: true    
    send_timestamps: true
    enable_open_metrics: false

  logging:
    verbosity: detailed

extensions:
  http_forwarder:
    ingress:
      endpoint: 0.0.0.0:7070
    egress:
      endpoint: https://0.0.0.0:8889/
      headers:
        otel_http_forwarder: dev
        Accept: application/openmetrics-text; version=1.0.0; charset=utf-8
      timeout: 60s

service:
  extensions: [http_forwarder]
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, prometheus]

Now if Prometheus scrapes the 7070 endpoint, the forwarder calls 8889 with the extra header, and Prometheus is able to get the exemplars.

I wonder why it is not working directly. @crobert-1
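The Accept-header behavior described above can be checked by hand without involving Prometheus at all. A minimal sketch using only the Python standard library (the endpoint URL comes from the configuration above; that your collector is actually reachable there is an assumption):

```python
import urllib.request

# Prometheus exporter endpoint from the collector configuration above.
URL = "http://localhost:8889/metrics"

# Request the OpenMetrics text format explicitly. Without this header the
# exporter serves the classic Prometheus text format, which cannot
# represent exemplars.
req = urllib.request.Request(
    URL,
    headers={"Accept": "application/openmetrics-text; version=1.0.0; charset=utf-8"},
)

# urllib stores header names in normalized form; confirm the header is set.
assert req.get_header("Accept").startswith("application/openmetrics-text")

# Uncomment to actually scrape (requires the collector to be running):
# with urllib.request.urlopen(req) as resp:
#     body = resp.read().decode()
#     print([line for line in body.splitlines() if "trace_id=" in line])
```

Running the commented-out scrape once with and once without the Accept header should reproduce the difference reported here: exemplar annotations only appear in the OpenMetrics response.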

@crobert-1
Member

I'm going to defer to others who have more information here for now.

@skansalintel

I am able to reproduce the same issue with the steps given by @dotnetstep. Are there any plans to resolve this issue anytime soon, or is there any workaround?

Contributor

github-actions bot commented Apr 8, 2024

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label Apr 8, 2024
@crobert-1 crobert-1 removed the Stale label Apr 8, 2024

Contributor

This issue has been closed as inactive because it has been stale for 120 days with no activity.

@github-actions github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) on Aug 18, 2024