
Otel Processor metric filter is not working as expected #32982

Closed
VJ1313 opened this issue May 10, 2024 · 4 comments
Labels: bug (Something isn't working), closed as inactive, needs triage (New item requiring triage), processor/filter (Filter processor), Stale, waiting for author

Comments

VJ1313 commented May 10, 2024

Component(s)

No response

Describe the issue you're reporting

This is in reference to ticket #28585.

We are using AWS Managed Prometheus.

Can someone help here?

I am trying to drop the entire bucket metric, but it is not working. Basically, we don't need any data from this metric: http_server_duration_milliseconds_bucket

```yaml
receivers:
  otlp:
    protocols:
      grpc: null
      http: null
processors:
  memory_limiter:
    limit_mib: 20
    check_interval: 5s
  batch: null
  filter/uri:
    error_mode: ignore
    metrics:
      metric:
        - IsMatch(name, "http_server_requests_.*")
      datapoint:
        - IsMatch(attributes["uri"], ".*/health.*")
        - IsMatch(attributes["uri"], ".*/version.*")
        - IsMatch(attributes["uri"], ".*/actuator.*")
  filter/http:
    error_mode: ignore
    metrics:
      metric:
        - name == "http_server_duration_milliseconds_bucket"
      datapoint:
        - attributes["net_protocol_name"] == "http"
```
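One thing worth checking here (not stated in the thread, but a property of how the filter processor works): the processor matches against OTLP data, where a histogram is a single metric; the `_bucket`, `_sum`, and `_count` series only exist after conversion to the Prometheus exposition format, so a filter on `http_server_duration_milliseconds_bucket` may never match anything inside the collector. A sketch of a filter keyed on the OTLP-side name instead — assuming the metric arrives under the older HTTP semconv name `http.server.duration`, which should be confirmed with a debug exporter first:

```yaml
processors:
  filter/http:
    error_mode: ignore
    metrics:
      metric:
        # Drops the whole histogram before it is ever converted to
        # Prometheus _bucket/_sum/_count series. The exact name below is
        # an assumption about this setup; verify it against debug output.
        - name == "http.server.duration"
```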
@VJ1313 VJ1313 added the needs triage New item requiring triage label May 10, 2024
@crobert-1 crobert-1 added processor/filter Filter processor bug Something isn't working labels May 10, 2024

Pinging code owners for processor/filter: @TylerHelmuth @boostchicken. See Adding Labels via Comments if you do not have permissions to add labels yourself.

crobert-1 (Member) commented:

The best way to start is to add a debug exporter to your pipeline, and see what the metrics look like coming out of the collector. This will help determine if the metric names are matching what the filter processor is expecting.
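A minimal sketch of that suggestion, assuming the receiver and processor names used in the config above (the `debug` exporter ships with the collector core; `verbosity: detailed` prints full metric names, attributes, and datapoints so you can see exactly what the filter conditions must match):

```yaml
exporters:
  debug:
    verbosity: detailed

service:
  pipelines:
    metrics:
      receivers: [otlp]
      # Run once with the filters removed to see the raw metric names,
      # then re-add them and compare the output.
      processors: [memory_limiter, filter/uri, filter/http, batch]
      exporters: [debug]
```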


This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.


This issue has been closed as inactive because it has been stale for 120 days with no activity.

@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale Sep 27, 2024