[TEST][FLAKY] darknet/test_forward.py::test_forward_conv_batch_norm #4284

Closed
tqchen opened this issue Nov 8, 2019 · 3 comments

Comments

tqchen (Member) commented Nov 8, 2019

https://ci.tvm.ai:8080/blue/rest/organizations/jenkins/pipelines/tvm/branches/master/runs/12/nodes/239/log/?start=0

tqchen changed the title from "[TEST][FLAKY] tests/python/frontend/darknet/test_forward.py::test_forward_conv_batch_norm" to "[TEST][FLAKY] darknet/test_forward.py::test_forward_conv_batch_norm" on Nov 8, 2019
tqchen (Member, Author) commented Nov 8, 2019

cc @srkreddy1238, can you take a look?

yongwww (Member) commented Nov 9, 2019

@siju-samuel it seems the flaky issue was introduced by DarkNet itself? The batch_norm test cases in the other frontends work fine even with a much smaller tolerance (tvm.testing.assert_allclose(x, tvm_out, rtol=1e-5, atol=1e-5)).
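
For context, the kind of check referenced above looks roughly like the sketch below (`reference_out` and `tvm_out` are placeholder arrays, not values from the failing test; in TVM's test suite the equivalent helper is `tvm.testing.assert_allclose`):

```python
# Minimal sketch of a tolerance-based output comparison as used in the
# frontend tests. numpy.testing.assert_allclose stands in here;
# tvm.testing.assert_allclose performs the same elementwise check.
import numpy as np

reference_out = np.array([0.999999, 2.000001, 3.0], dtype="float32")  # framework output (placeholder)
tvm_out = np.array([1.0, 2.0, 3.0], dtype="float32")                  # compiled-model output (placeholder)

# Passes only if every element agrees within rtol/atol; raises AssertionError otherwise.
np.testing.assert_allclose(reference_out, tvm_out, rtol=1e-5, atol=1e-5)
```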

tqchen (Member, Author) commented Dec 23, 2019

This could be due to the random numbers used for the variance; a solution was introduced in #4571. Closing for now.
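
For illustration only (this is not the fix from #4571, whose contents are not shown here), one common way to avoid this kind of flakiness is to seed the randomly generated inputs and keep the per-channel variance well away from zero, since batch norm's 1/sqrt(var + eps) term amplifies small numerical differences when the variance is tiny:

```python
# Illustrative sketch only -- not the actual patch from PR #4571.
# Fixed seed -> reproducible inputs; a spread-out uniform distribution
# keeps the per-channel variance far above eps, so the tolerance check
# is stable across runs and backends.
import numpy as np

def ref_batch_norm(x, eps=1e-5):
    # Reference batch norm over batch/spatial axes, per channel (NCHW layout assumed).
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.RandomState(0)                                # fixed seed
x = rng.uniform(-1.0, 1.0, (1, 3, 8, 8)).astype("float32")    # variance well above eps

out_a = ref_batch_norm(x)
out_b = ref_batch_norm(x.astype("float64")).astype("float32")  # stand-in for a second backend
np.testing.assert_allclose(out_a, out_b, rtol=1e-5, atol=1e-5)
```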
