BUG: Travis builds are failing due to "pip install cython" #199

Closed
pcmoore opened this issue Jan 8, 2020 · 9 comments

Comments

@pcmoore
Member

pcmoore commented Jan 8, 2020

When merging PRs today I noticed that Travis started failing due to problems installing Cython via pip:

We need to fix this soon.

@pcmoore
Member Author

pcmoore commented Jan 8, 2020

I tried a quick fix of returning the build to the default Python version (v3.6 as of now), but with that we are now seeing a Python test failure.

Travis patch:

Author: Paul Moore <[email protected]>
Date:   Wed Jan 8 00:47:03 2020 -0500

    travis: use the default Python version
    
    Signed-off-by: Paul Moore <[email protected]>

diff --git a/.travis.yml b/.travis.yml
index da1cde5..723a885 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -16,8 +16,6 @@ compiler:
   - gcc
 
 language: python
-python:
-  - "nightly"
 
 addons:
   coverity_scan:

Test failure: https://travis-ci.org/pcmoore/misc-libseccomp/builds/634088624

 batch name: 24-live-arg_allow
 test mode:  python
 test type:  live
Test 24-live-arg_allow%%001-00001 result:   FAILURE 24-live-arg_allow 1 ALLOW rc=159

@pcmoore pcmoore added this to the v2.4.3 milestone Jan 8, 2020
@drakenclimber
Member

Yuck - I agree this is high priority. I'll try and help debug this today.

@drakenclimber
Member

I have hacked around with this a little bit, and I don't like what I have found. As best as I can tell, test 24 seems dependent upon what has been run before it.

I created a branch and I have two patches in it.

Patch 1 - The above patch to disable the nightly python - Automated tests fail (due to test 24)
Patch 2 - Only run test 24 - Automated tests (i.e. test 24) pass

@pcmoore
Member Author

pcmoore commented Jan 12, 2020

I have hacked around with this a little bit, and I don't like what I have found. As best as I can tell, test 24 seems dependent upon what has been run before it.

Well that is really annoying, and not something that makes much sense. Sigh. Thanks for playing with it; the latter half of last week was a bit bonkers for me, and I expect the rest of January will be the same.

@whereswaldon has a fix in #201, but I'm starting to think the proper fix for Travis might be to just skip the Python live tests and only run the native/C live tests; otherwise I'm afraid we will continue to have sporadic test failures due to Python changes in the Travis environment. The Python bindings are a thin layer that doesn't actually generate any BPF; that's left to the C code. As long as we continue to run the simulated Python tests and the native/C live tests I think we should be okay, thoughts?
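
To make the "thin layer" point concrete, here is a rough sketch (this is not one of the repository's tests, and the C function names in the comments are an assumption about what each binding call forwards to):

```python
# Rough illustrative sketch, not an actual libseccomp test.  The assumption
# here is that every call below is forwarded straight to the C library,
# which is where all of the BPF generation happens.
import sys
import seccomp

# SyscallFilter() wraps the C-side seccomp_init(); no BPF exists yet
f = seccomp.SyscallFilter(defaction=seccomp.KILL)

# add_rule() wraps seccomp_rule_add(); still only filter state, no BPF
f.add_rule(seccomp.ALLOW, "read")
f.add_rule(seccomp.ALLOW, "write")
f.add_rule(seccomp.ALLOW, "exit_group")

# export_pfc() wraps seccomp_export_pfc(); the filter program is generated
# entirely by the C code, the bindings just hand over a file descriptor
f.export_pfc(sys.stdout)

# load() wraps seccomp_load() and is the only step that touches the kernel;
# it is what the live tests exercise and what would be skipped on Travis
# f.load()
```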

FWIW, I consider Travis to be a bit of a special case and not always indicative of real distros, but the benefits of CI outweigh the risks due to the difference. Further, I would still expect us to run the live tests as part of the regular release process testing.

@drakenclimber
Member

As long as we continue to run the simulated Python tests and the native/C live tests I think we should be okay, thoughts?

Yes, I think this is the best option available to us.

@whereswaldon
Contributor

So we'd only be skipping the live tests of the shim layer that invokes libseccomp from Python?

What are the "simulated Python tests"? I'm trying to understand the different kinds of testing in use in the repo. I assume that the live tests run against the kernel underneath them, but what do the simulated tests do? Do they fake the seccomp syscall or something?

@drakenclimber
Member

drakenclimber commented Jan 14, 2020

So we'd only be skipping the live tests of the shim layer that invokes libseccomp from Python?

Correct.

What are the "simulated Python tests"? I'm trying to understand the different kinds of testing in use in the repo. I assume that the live tests run against the kernel underneath them, but what do the simulated tests do? Do they fake the seccomp syscall or something?

Good questions. There are several test types in libseccomp:

  • simulated tests - These tests run the entire libseccomp stack against a simulated BPF parser and do not actually load the seccomp/BPF program into the kernel. Any test in the /tests folder named XX-sim-*.py is of this category.
  • basic tests - These tests exercise basic libseccomp functionality and likely do not involve actual BPF logic. Rather, these tests ensure libseccomp flags, settings, and APIs are behaving properly. Any test named XX-basic-*.py is of this category.
  • live tests - As you surmised, these tests actually load the seccomp/BPF filter into the kernel and verify real runtime behavior. We don't have a lot of these tests because they are obviously kernel (and potentially distribution) dependent. Also, the goal of the test suite is to ensure libseccomp is behaving properly; bugs in the kernel are obviously important, but they aren't the responsibility of this test suite. Any test named XX-live-*.py is of this category. (A minimal sketch contrasting the simulated and live styles follows this list.)
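
To picture the simulated vs. live split, here is a minimal, hypothetical sketch; it is not one of the tests in tests/, the real suite is driven by the regression harness rather than a --live flag, and the syscall choices are only for illustration:

```python
# Hypothetical sketch contrasting the simulated and live test styles; the
# real libseccomp tests are driven by the regression harness instead.
import errno
import os
import sys
import seccomp

# Allow everything by default, but make directory creation fail with EPERM
# so the live branch below is safe to run without killing the interpreter.
f = seccomp.SyscallFilter(defaction=seccomp.ALLOW)
# cover both entry points since libc may use either syscall
f.add_rule(seccomp.ERRNO(errno.EPERM), "mkdir")
f.add_rule(seccomp.ERRNO(errno.EPERM), "mkdirat")

if "--live" in sys.argv:
    # live style: the filter really is loaded into the kernel and the test
    # then verifies actual runtime behavior
    f.load()
    try:
        os.mkdir("live_test_dir")
        print("FAILURE: mkdir() was not filtered")
    except PermissionError:
        print("SUCCESS: mkdir() was rejected with EPERM")
else:
    # simulated style: the generated program is only exported so it can be
    # fed to a simulated BPF parser; nothing is loaded into the kernel
    with open("filter.bpf", "wb") as out:
        f.export_bpf(out)
```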

Finally, all of the above tests can test the C APIs or the python APIs. You will typically see two copies of a test (a *.c and a *.py) in our test directory.

@pcmoore is proposing that we no longer run the live python tests since they are encountering errors outside of our control. Given that we're still running live C tests and simulated python tests, I feel like this is probably our best approach.

@whereswaldon
Contributor

I've attempted to implement this proposal in #202

@pcmoore
Member Author

pcmoore commented Jan 20, 2020

This should be fixed in #202, closing for now.

@pcmoore pcmoore closed this as completed Jan 20, 2020