
[chore] testbed: sending queue otlp #28904

Merged
merged 10 commits into from
Jan 4, 2024

Conversation

omrozowicz-splunk
Contributor

Description: This PR adds two tests that exercise an OTLP receiver and exporter with sending queues. We cover two scenarios:

Sending queue full

  1. We generate permanent errors until the `sending_queue is full` log appears in the agent's logs
  2. Then we collect the IDs of logs that should be retried and the IDs of logs received successfully, and check that all of them were retried

The current testbed cannot observe these errors from the load generator's perspective, so I added `LogsToRetry` to `mock_backend` to track which logs suffered a permanent error.

Sending queue not full
A sanity test of the sending queue's default behavior, without making the queue full.

So far only logs sending queues are covered; I'm not sure whether we should add this for every data type. Currently, one test takes about 9s.

Link to tracking Issue: A related issue is #20552, as these tests cover the `Exporter helper QueuedRetry queue limit size is hit` scenario
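For reference, the kind of exporter configuration these tests exercise looks roughly like this (illustrative values; a small `queue_size` makes the queue fill quickly under sustained permanent errors):

```yaml
exporters:
  otlp:
    endpoint: backend:4317
    sending_queue:
      enabled: true
      num_consumers: 2
      queue_size: 10   # small on purpose, so the test hits "sending_queue is full"
    retry_on_failure:
      enabled: true
```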

@@ -322,3 +324,35 @@ func (tc *TestCase) logStatsOnce() {
tc.LoadGenerator.GetStats(),
tc.MockBackend.GetStats())
}

// Used to search for text in agent.log
// It can be used to verify if we've hit QueuedRetry sender or memory limitter
Contributor
Suggested change
// It can be used to verify if we've hit QueuedRetry sender or memory limitter
// It can be used to verify if we've hit QueuedRetry sender or memory limiter

Contributor Author
fixed


// Used to search for text in agent.log
// It can be used to verify if we've hit QueuedRetry sender or memory limitter
func (tc *TestCase) SearchText(text string) bool {
Contributor
Suggested change
func (tc *TestCase) SearchText(text string) bool {
func (tc *TestCase) AgentLogsContains(text string) bool {

Contributor Author
done

Contributor

This PR was marked stale due to lack of activity. It will be closed in 14 days.

@github-actions github-actions bot added the Stale label Nov 30, 2023
@omrozowicz-splunk
Contributor Author

Hey @mx-psi, can you take a look? 🥺

@mx-psi mx-psi requested a review from atoulme November 30, 2023 17:42
@github-actions github-actions bot removed the Stale label Dec 1, 2023
Contributor

This PR was marked stale due to lack of activity. It will be closed in 14 days.

@github-actions github-actions bot added the Stale label Dec 16, 2023
@github-actions github-actions bot removed the Stale label Dec 19, 2023
Member

@mx-psi left a comment

Hey, sorry I didn't get to this sooner 😅 I won't be active over the holidays, but I can review when I'm back. I merged main into your PR and it looks like there are a few linting errors.

I would also like @atoulme to have a look at this before merging.

@mx-psi mx-psi merged commit 2fe33d3 into open-telemetry:main Jan 4, 2024
85 checks passed
@github-actions github-actions bot added this to the next release milestone Jan 4, 2024
cparkins pushed a commit to AmadeusITGroup/opentelemetry-collector-contrib that referenced this pull request Jan 10, 2024