
pytest.skip will skip entire files when used as a decorator #607

Closed
pytestbot opened this issue Oct 3, 2014 · 16 comments
Labels
good first issue (easy issue that is friendly to new contributors) · type: bug (problem that needs to be addressed)
Milestone

Comments

@pytestbot
Contributor

Originally reported by: Bryce Lampe (BitBucket: blampe, GitHub: blampe)


I've seen several people assume that @pytest.skip can be used in place of @pytest.mark.skipif(True, reason="foo"). It seems like a reasonable convenience and it seems to work. However, this is actually quite dangerous because it has the unexpected side-effect of skipping every test in the file:

$ cat test.py
import pytest

@pytest.skip
def test_1():
    print 1

def test_2():
    print 2

def test_3():
    print 3

Expected behavior:

$ py.test test.py -v
============================= test session starts ==============================
platform darwin -- Python 2.7.8 -- py-1.4.25 -- pytest-2.6.3 -- /Users/blampe/env/bin/python2.7
collected 2 items

test.py::test_2 PASSED
test.py::test_3 PASSED

=========================== 2 passed in 0.01 seconds ===========================

Actual behavior:

$ py.test test.py -v
============================= test session starts ==============================
platform darwin -- Python 2.7.8 -- py-1.4.25 -- pytest-2.6.3
collected 0 items / 1 skipped

========================== 1 skipped in 0.00 seconds ==========================

Note that this is reporting the wrong number of skipped tests, so this definitely seems unintentional.


@pytestbot
Contributor Author

Original comment by holger krekel (BitBucket: hpk42, GitHub: hpk42):


What about making pytest.skip fail when a function object is passed, pointing users to "pytest.mark.skipif" and the docs?
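
A minimal sketch of what such a guard might look like (hypothetical, not pytest's actual implementation; the Skipped class below is just a stand-in for pytest's internal skip exception):

# Hypothetical guard sketch -- not pytest's real code.
class Skipped(Exception):
    """Stand-in for pytest's internal skip exception."""

def skip(msg=""):
    if callable(msg):
        # A function object was passed, so @pytest.skip was almost certainly
        # used as a decorator: refuse loudly instead of skipping the module.
        raise TypeError(
            "pytest.skip cannot be used as a decorator; "
            "use @pytest.mark.skipif(True, reason=...) instead"
        )
    raise Skipped(msg)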

@pytestbot
Contributor Author

Original comment by Floris Bruynooghe (BitBucket: flub, GitHub: flub):


Sounds like a reasonable solution to me.

@pytestbot
Contributor Author

Original comment by Kiril Vladimiroff (BitBucket: Vladimiroff, GitHub: Vladimiroff):


If there is going to be a check like this, why not simply apply pytest.mark.skipif(True, msg) when a function object is passed to pytest.skip?
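
For contrast, a sketch of that alternative, again hypothetical and not pytest's actual code (skip_or_mark is an illustrative name):

import pytest

# Hypothetical sketch of the auto-conversion idea -- not pytest's real code.
def skip_or_mark(msg=""):
    if callable(msg):
        # Used as a decorator: attach an unconditional skipif mark instead.
        return pytest.mark.skipif(True, reason="unconditional skip")(msg)
    # Used imperatively inside a running test: defer to the real pytest.skip.
    pytest.skip(msg)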

@pytestbot
Contributor Author

Original comment by Bryce Lampe (BitBucket: blampe, GitHub: blampe):


I agree with Kiril. I think there's value in an unconditional skip, and I've seen several people assume that this is the existing behavior.

@pytestbot
Contributor Author

Original comment by holger krekel (BitBucket: hpk42, GitHub: hpk42):


From an API design perspective I dislike mixing pytest.mark.skipif and pytest.skip, because it mixes a "declarative" (mark it as being skipped) and an "imperative" (skip it now) way of doing things. Bailing out with a clear error that points to "mark.skipif" lets us keep the conceptual separation, which is better in the long run IMO.
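
To illustrate the distinction with the existing, documented APIs:

import sys
import pytest

# Declarative: the mark states at collection time that the test should be
# skipped under the given condition.
@pytest.mark.skipif(sys.platform == "win32", reason="POSIX-only behaviour")
def test_declarative_skip():
    assert True

# Imperative: the skip happens while the test body is already executing.
def test_imperative_skip():
    if sys.platform == "win32":
        pytest.skip("POSIX-only behaviour")
    assert True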

@pytestbot
Contributor Author

Original comment by Kiril Vladimiroff (BitBucket: Vladimiroff, GitHub: Vladimiroff):


Fair point, but from a usage perspective I find it awkward to write @pytest.mark.skipif(True, reason="...") instead of @pytest.skip.

@pytestbot
Contributor Author

Original comment by Bryce Lampe (BitBucket: blampe, GitHub: blampe):


Bailing out with a clear error that points to "mark.skipif" lets us keep the conceptual separation, which is better in the long run IMO.

I don't think there is any such conceptual separation for users of the API, though. This is largely because

  • Both methods can already be used declaratively, giving the appearance that this works as expected.
  • Declarative syntax for @pytest.skip mimics existing libraries.

In fact, writing @pytest.mark.skipif(True, ...) is so awkward that I regularly see people use @unittest.skip instead, only because it does what they need, it's compatible with pytest methods, and they're aware of this bug. I say give the people what they want :)
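
The workaround mentioned above looks like this; as far as I can tell pytest reports the test as skipped because calling the wrapped function raises unittest.SkipTest, which pytest honours:

import unittest

# unittest's unconditional skip decorator, also honoured by pytest.
@unittest.skip("not ready yet")
def test_not_ready():
    assert False  # never executed; the test is reported as skipped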

@pytestbot
Contributor Author

Original comment by Anatoly Bubenkov (BitBucket: bubenkoff, GitHub: bubenkoff):


I agree with Holger that it's a bad idea to mix concepts.
I would stick to the markers approach:
introduce a new marker, pytest.mark.skip(reason=None), as a shortcut for pytest.mark.skipif(True, reason=reason)
this way we'll simplify the common task of skipping a certain test
and then let's strictly prevent using pytest.skip as a decorator
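
A usage sketch of the proposed marker (this is the shape it eventually shipped with in pytest 2.9, per a later comment in this thread):

import pytest

# The proposed shortcut: an unconditional, declarative skip with a reason,
# equivalent to @pytest.mark.skipif(True, reason=...).
@pytest.mark.skip(reason="feature not implemented yet")
def test_unconditionally_skipped():
    assert True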

@pytestbot
Contributor Author

Original comment by Floris Bruynooghe (BitBucket: flub, GitHub: flub):


I think the case is made that the pytest.skip() API is a little confusing for people migrating from other unittest frameworks. However, I also think that making pytest.skip() work as a decorator breaks py.test's consistency, where the clear pattern is that all decorators applied to test functions are @pytest.mark.xxx. Starting to break that now will just make the confusion worse rather than better.

So with that in mind I am also in favour of detecting @pytest.skip and referring to @pytest.mark.skip (I think @bubenkoff's idea of that shortcut might be a good plan), as @hpk42 suggests.

If that is still too confusing then I'd suggest that py.test drop the pytest.skip() and pytest.fail() API completely, going with raise pytest.SkipException(msg) and raise pytest.FailException(msg) instead, but that's majorly backwards-incompatible, so not a very viable option either.
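
Purely for illustration, the exception-only API floated here might look as follows; pytest.SkipException and pytest.FailException are hypothetical names from this comment, not existing pytest attributes:

# Hypothetical exception-only API -- these classes do not exist in pytest.
class SkipException(Exception):
    """Would replace pytest.skip()."""

class FailException(Exception):
    """Would replace pytest.fail()."""

def test_requires_network():
    raise SkipException("no network access in this environment")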

@pytestbot
Contributor Author

Original comment by Kiril Vladimiroff (BitBucket: Vladimiroff, GitHub: Vladimiroff):


@flub, great, then my pull request will do fine until the @pytest.mark.skip decorator is ready :)

@pytestbot
Contributor Author

Original comment by Anatoly Bubenkov (BitBucket: bubenkoff, GitHub: bubenkoff):


that's not how it works, I'm afraid
we cannot just merge it 'until'
but you can create another PR implementing the discussed solution :)

@pytestbot pytestbot added the type: bug (problem that needs to be addressed) label Jun 15, 2015
@RonnyPfannschmidt RonnyPfannschmidt added the good first issue (easy issue that is friendly to new contributors) label Jul 25, 2015
@RonnyPfannschmidt RonnyPfannschmidt modified the milestones: 2.8, 2.8.dev Sep 13, 2015
@RonnyPfannschmidt RonnyPfannschmidt modified the milestones: 2.8.2, 2.8.1 Sep 27, 2015
@nicoddemus
Member

FWIW the implementation for pytest.mark.skip was just merged into the features branch, targeted for the 2.9 release. 😄

@RonnyPfannschmidt RonnyPfannschmidt modified the milestones: 2.8.2, 2.8.3 Oct 10, 2015
@RonnyPfannschmidt
Member

@nicoddemus @hpk should we issue a pytest_warning if pytest.skip happens outside of the runtestprotocol?

@nicoddemus
Member

@nicoddemus @hpk should we issue a pytest_warning if pytest.skip happens outside of the runtestprotocol?

Sounds good, but as we commented before, the current warnings system could use some improvements.

@nicoddemus nicoddemus removed this from the 2.8.3 milestone Nov 23, 2015