
[processor/metricstransform]: Add scaling exponential histogram support #34039

Open

wildum wants to merge 3 commits into main
Conversation


@wildum wildum commented Jul 11, 2024

Description: This PR adds support for exponential histograms to the experimental_scale_value operation in the metricstransformprocessor.

The scaling works by taking the middle value of the first bucket (the one that corresponds to the offset), multiplying it by the scale factor, and then finding the offset that corresponds to this new value (the method used is described here: https://opentelemetry.io/docs/specs/otel/metrics/data-model/#all-scales-use-the-logarithm-function).

The bucket counts themselves are unchanged because they are effectively "shifted" by the new offset. I initially remapped all of the values, but I always ended up with the same buckets, so I left them untouched to keep the code simpler and avoid unnecessary work.
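
For illustration, here is a rough, self-contained Go sketch of that offset computation (the function names and boundary handling are my own and are not the PR's actual code; exact boundary cases may differ from the processor's implementation):

```go
package main

import (
	"fmt"
	"math"
)

// mapToIndex returns the exponential histogram bucket index for value at the
// given scale, following the OTel data model: index ≈ floor(log2(value) * 2^scale).
// (Values exactly on a bucket boundary may map one index higher than an exact
// implementation; this is only an illustration of the offset shift.)
func mapToIndex(value float64, scale int32) int32 {
	scaleFactor := math.Ldexp(math.Log2E, int(scale)) // 2^scale / ln(2)
	return int32(math.Floor(math.Log(value) * scaleFactor))
}

// lowerBoundary returns the lower boundary of the bucket at index for the
// given scale: 2^(index * 2^-scale).
func lowerBoundary(index, scale int32) float64 {
	inverseFactor := math.Ldexp(math.Ln2, int(-scale))
	return math.Exp(float64(index) * inverseFactor)
}

// scaledOffset computes the new bucket offset after multiplying every
// recorded value by multiplier: it scales the midpoint of the first
// populated bucket and maps that midpoint back to an index.
func scaledOffset(offset, scale int32, multiplier float64) int32 {
	lower := lowerBoundary(offset, scale)
	upper := lowerBoundary(offset+1, scale)
	midpoint := (lower + upper) / 2 * multiplier
	return mapToIndex(midpoint, scale)
}

func main() {
	// Example: at scale 0 the buckets are (1,2], (2,4], (4,8], ...
	// Multiplying values by 4 shifts every bucket up by exactly 2 indexes,
	// so the counts stay in place and only the offset changes.
	fmt.Println(scaledOffset(0, 0, 4)) // prints 2
}
```

Because the scale stays the same and only the offset moves, the per-bucket counts can be copied over unchanged, which is the simplification described above.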

Link to tracking Issue: #29803

Testing: unit test + e2e local test

@wildum wildum requested review from dmitryax and a team as code owners July 11, 2024 09:52
@github-actions github-actions bot added the processor/metricstransform Metrics Transform processor label Jul 11, 2024
@jpkrohling (Member) left a comment


The code looks sane, but I need help from someone who understands histograms.

@carrieedwards, are you available to review this one?

3 participants