
TestZookeeperMetricsScraperScrape is unstable #3034

Closed
tigrannajaryan opened this issue Apr 9, 2021 · 2 comments
Labels: bug Something isn't working

Comments

@tigrannajaryan (Member)

Test failed on CI: https://app.circleci.com/pipelines/github/open-telemetry/opentelemetry-collector-contrib/12168/workflows/ac4bf3ff-deb7-4381-89c1-0126236828f3/jobs/104093

```
--- FAIL: TestZookeeperMetricsScraperScrape (1.92s)
    --- FAIL: TestZookeeperMetricsScraperScrape/Error_setting_connection_deadline (0.11s)
        scraper_test.go:242:
            	Error Trace:	scraper_test.go:242
            	Error:      	Not equal:
            	            	expected: 1
            	            	actual  : 0
            	Test:       	TestZookeeperMetricsScraperScrape/Error_setting_connection_deadline
FAIL
FAIL	github.com/open-telemetry/opentelemetry-collector-contrib/receiver/zookeeperreceiver	2.015s
?   	github.com/open-telemetry/opentelemetry-collector-contrib/receiver/zookeeperreceiver/internal/metadata	[no test files]
FAIL
```
tigrannajaryan added the bug (Something isn't working) label on Apr 9, 2021
@tigrannajaryan (Member, Author)

@asuresh4 can you please have a look? I believe you are the author of the test.

@asuresh4 (Member)

@tigrannajaryan - I haven't been able to reproduce this locally so far, but I believe the increased timeout in the attached PR should help avoid premature assertions.
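For reference, a minimal sketch of what replacing a one-shot assertion with a polling one can look like in Go; the helper name, the `observed` callback, and the 10s/10ms timings are assumptions for illustration, not the actual change in the PR:

```go
// A sketch of avoiding premature assertions in a flaky test, assuming a
// testify-based test. Names and timings here are illustrative only.
package zookeeperreceiver

import (
	"testing"
	"time"

	"github.com/stretchr/testify/assert"
)

// assertErrorCountEventually polls instead of asserting once, so a slow CI
// worker still has time to record the connection-deadline error before the
// test checks for it.
func assertErrorCountEventually(t *testing.T, observed func() int, want int) {
	assert.Eventually(t, func() bool {
		return observed() == want
	}, 10*time.Second, 10*time.Millisecond)
}
```

`assert.Eventually` keeps retrying the condition until the timeout elapses, which is usually enough to absorb CI scheduling jitter without slowing down the happy path.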

tigrannajaryan pushed a commit that referenced this issue Apr 30, 2021
Attempt to make the test stable by waiting until the listener is set up.

**Link to tracking Issue:** #3034
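A minimal sketch of the "wait until the listener is set up" idea in Go; `startMockZookeeper` and its single-connection behavior are illustrative assumptions, not the receiver's actual test helper:

```go
// A sketch of ensuring the listener exists before the scraper dials it:
// return the address only after net.Listen has succeeded.
package zookeeperreceiver

import (
	"net"
	"testing"
)

func startMockZookeeper(t *testing.T) (addr string, done chan struct{}) {
	ln, err := net.Listen("tcp", "127.0.0.1:0")
	if err != nil {
		t.Fatalf("failed to start mock server: %v", err)
	}
	done = make(chan struct{})
	go func() {
		defer close(done)
		conn, err := ln.Accept() // serve a single connection, then stop
		if err != nil {
			return
		}
		conn.Close()
		ln.Close()
	}()
	// Listen has already returned here, so the port is open before the
	// test receives the address and attempts to connect.
	return ln.Addr().String(), done
}
```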
alexperez52 referenced this issue in open-o11y/opentelemetry-collector-contrib Aug 18, 2021
Benchmarks Before:
```
goos: darwin
goarch: amd64
pkg: go.opentelemetry.io/collector/processor/batchprocessor
cpu: Intel(R) Core(TM) i9-9880H CPU @ 2.30GHz
BenchmarkSplitMetrics
BenchmarkSplitMetrics-16    	    6660	    170653 ns/op	  182889 B/op	    3309 allocs/op
PASS

Process finished with the exit code 0
```

Benchmarks After:
```
goos: darwin
goarch: amd64
pkg: go.opentelemetry.io/collector/processor/batchprocessor
cpu: Intel(R) Core(TM) i9-9880H CPU @ 2.30GHz
BenchmarkSplitMetrics
BenchmarkSplitMetrics-16    	    7858	    134259 ns/op	  141881 B/op	    2596 allocs/op
PASS

Process finished with the exit code 0
```

Benchmarks Reference Clone:
```
goos: darwin
goarch: amd64
pkg: go.opentelemetry.io/collector/processor/batchprocessor
cpu: Intel(R) Core(TM) i9-9880H CPU @ 2.30GHz
BenchmarkCloneMetrics
BenchmarkCloneMetrics-16    	    8726	    127948 ns/op	  137816 B/op	    2503 allocs/op
PASS

Process finished with the exit code 0
```

Signed-off-by: Bogdan Drutu <[email protected]>
mstumpfx pushed a commit to mstumpfx/opentelemetry-collector-contrib that referenced this issue Aug 31, 2021
Attempt to make the test stable by waiting until the listener is set up.

**Link to tracking Issue:** open-telemetry#3034