[receiver/prometheusreceiver] Allow for the alternate stringlabels implementation of labels to be usable #31907

Closed
braydonk opened this issue Mar 22, 2024 · 7 comments
Labels
enhancement (New feature or request), receiver/prometheus (Prometheus receiver)

Comments

@braydonk
Contributor

Component(s)

receiver/prometheus

Is your feature request related to a problem? Please describe.

In the Prometheus SDK, there is an alternate implementation of the labels.Labels type that uses a single length-encoded string to represent a collection of labels instead of a slice. In my non-rigorous testing, I saw some reasonable improvements when stress-testing a collector built with `-tags=stringlabels`.

However, I don't mean for this issue to be an argument about whether or how this should be used. This issue is about the fact that the current implementation of the Prometheus receiver assumes labels.Labels is a slice under the hood, which makes building with `-tags=stringlabels` impossible.

Describe the solution you'd like

labels.Labels has a set of public methods that allows you to interact with it in the same way regardless of which implementation it's built with. I'd like to change the spots that make the slice assumption to use those public methods to achieve an equivalent result.

This issue can be considered closed when it is possible to build a collector with this receiver using the stringlabels build tag.

Describe alternatives you've considered

No response

Additional context

I learned about this alternate implementation of labels and why it improves memory usage from this OpenObservability Day 2023 talk: https://www.youtube.com/watch?v=29yKJ1312AM

@braydonk braydonk added the enhancement (New feature or request) and needs triage (New item requiring triage) labels Mar 22, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the receiver/prometheus (Prometheus receiver) label Mar 22, 2024
@braydonk braydonk changed the title Allow for the alternate stringlabels implementation of stringlabels to be usable [receiver/prometheusreceiver] Allow for the alternate stringlabels implementation of stringlabels to be usable Mar 22, 2024
@braydonk braydonk changed the title [receiver/prometheusreceiver] Allow for the alternate stringlabels implementation of stringlabels to be usable [receiver/prometheusreceiver] Allow for the alternate stringlabels implementation of labels to be usable Mar 22, 2024
@dashpole
Contributor

> However I don't mean for this issue to be an argument about whether or not this should be used/how.

Can we track that question somewhere? I have no objections to using the public methods, but would love to make sure the performance improvements land. Will that build tag be the default someday, or would we need to update the core/contrib builds to use it?

@bwplotka do you know if anyone is exploring this optimization for the prometheus server?

@braydonk
Contributor Author

> Can we track that question somewhere? I have no objections to using the public methods, but would love to make sure the performance improvements land.

I can make another issue to track that question. I'm not sure where I personally stand on how, if at all, this should be adopted by an upstream release. That's why I opted to position this only as making it possible to build with stringlabels, so that distributions or custom builds of the collector could opt in. However, it could also be a good option to provide a contrib release with this tag enabled as a special case.

> do you know if anyone is exploring this optimization for the prometheus server?

The reason I brought up that option above is that this seems to be what upstream Prometheus does. For a while they distributed alternative releases with stringlabels enabled, e.g. https://github.com/prometheus/prometheus/releases/tag/v2.43.0%2Bstringlabels
However, it seems they stopped doing that, so maybe it is the default now? They do seem to be doing that with the other new labels implementation, dedupelabels: https://github.com/prometheus/prometheus/releases/tag/v2.51.0%2Bdedupelabels
I haven't messed with that at all yet; I wanted to go after stringlabels first since it's been around longer and seems well battle-tested from what I've found.

dmitryax pushed a commit that referenced this issue Mar 27, 2024
…#31908)

**Description:**
By using only the public method API for Prometheus labels (rather than
assuming `labels.Labels` is an alias of a slice), this opens up the
possibility of building a collector with the `stringlabels` tag, so we can
use the more memory-efficient labels implementation.

**Link to tracking Issue:** #31907

**Testing:** 
I had trouble running all of the tests locally, so I think I will need
some help with making that work. I did run all the tests I changed with
`-tags=stringlabels` and without it.
rimitchell pushed a commit to rimitchell/opentelemetry-collector-contrib that referenced this issue May 8, 2024
…open-telemetry#31908)

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label May 22, 2024
@braydonk
Contributor Author

Unstale (is that possible through comments in this repo?)

This is still relevant.

@dashpole dashpole removed the Stale and needs triage (New item requiring triage) labels May 22, 2024
@dashpole
Contributor

Are there any other changes required after #31908? Or can this be closed?

@braydonk
Contributor Author

Oh, I forgot that the PR was closed. Then this issue can probably be closed; there's probably more to look into in terms of providing a build of the collector that has this enabled, but that can go into a new issue instead. For my use case this is good enough, as I'll build it with the stringlabels tag myself.

Closing this now, thanks!
