
Add pytest.mark.skip shortcut (Issue #607) #1040

Closed
wants to merge 35 commits

Conversation

MichaelAquilina
Contributor

Add a non-conditional skip marker for convenience as requested in Issue #607.

I will also probably be adding another feature which automatically treats @pytest.skip as pytest.mark.skip if it detects it is being used as a decorator. I was thinking that this should probably be opened as a separate pull request, unless you don't mind it being included here?

Ran python runtox.py -e py27,py34,flakes to test changes.

This work is a result of the pyconuk pytest sprint! :)
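
For illustration, the shortcut added here lets a test be skipped unconditionally along these lines (a minimal sketch based on the issue description; the exact API details are discussed in the review below):

import pytest

# With the new marker there is no need for an always-true condition such as
# @pytest.mark.skipif("True", ...); the reason argument is optional.
@pytest.mark.skip(reason="not implemented yet")
def test_feature():
    assert False  # never executed; the test is reported as skipped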

@MichaelAquilina changed the title from "#607 Add pytest.mark.skip shortcut" to "Add pytest.mark.skip shortcut (Issue #607)" on Sep 21, 2015
@MichaelAquilina
Contributor Author

I'll fix the tests as soon as I can.

I assume I'll need to update the docs for this?

@RonnyPfannschmidt
Member

for pytest.skip/xfail being used in such a way,
we should first introduce a global concept of whether we are currently inside a test call

the docs certainly need updates

the failures are related to backward compat around twisted trial, i suggest you install twisted in your venv while fixing up the interaction

good initial work :)

@flub
Member

flub commented Sep 21, 2015

For part 2, if pytest.skip is used as a decorator it should not behave as pytest.mark.skip, instead it should result in a collection error. Maybe simply raising an exception is sufficient for this.

@RonnyPfannschmidt
Member

instead of teaching skip to return a mark, we could have collection catch skip errors and give a warning

if eval_skipif.istrue():
    item._evalskip = eval_skipif
    pytest.skip(eval_skipif.getexplanation())
elif isinstance(item.keywords.get('skip'), MarkInfo):
Contributor Author

Maybe I should add a comment explaining why this is needed? Or do you feel it's unnecessary?

Member

Please do, seems "magical" 😉

Member

Thanks for the work! I noticed two problems while playing around with it however:

  • reason is not handled when missing, so the user sees this when using a plain @pytest.mark.skip:

    SKIP [1] test_xx.py:2: <Skipped instance>
    
  • the user could still use pytest.mark.skip('False') and the test would not skip.

This fixes it, but it is not pretty 😅:

@pytest.hookimpl(tryfirst=True)
def pytest_runtest_setup(item):
    evalskip = MarkEvaluator(item, 'skip')
    if evalskip.holder is not None:
        evalskip.reason = evalskip.holder.kwargs.get('reason', 'explicit skip')
        evalskip.result = True
        item._evalskip = evalskip
        pytest.skip(evalskip.getexplanation())

    evalskip = MarkEvaluator(item, 'skipif')
    if evalskip.istrue():
        item._evalskip = evalskip
        pytest.skip(evalskip.getexplanation())
    ...

The implementation lets the user omit the reason, in which case it falls back to "explicit skip"... perhaps we want to enforce passing one?

Contributor Author

Ah yes, sorry for not catching this. Will give it a closer look and see if there's a nice way to fix this. I'll also make sure my tests are updated to prevent this from happening again.

@MichaelAquilina
Contributor Author

@RonnyPfannschmidt could you elaborate a bit more? Not sure that I understand your comment

@MichaelAquilina
Contributor Author

@flub I guess that provides a more consistent front in terms of the way pytest works. But #607 seems to conclude that @pytest.skip should act like @pytest.mark.skip if used as a decorator? I don't mind either way.

@RonnyPfannschmidt
Member

@MichaelAquilina when collection catches a pytest.skip.Exception, we should do a config.warn(...)
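
To make that suggestion concrete, a rough sketch of the idea (the helper name and warning code below are made up for illustration, not pytest internals):

import pytest

def collect_with_warning(collector):
    # Hypothetical wrapper: wherever collection executes user code, a raised
    # pytest.skip.Exception would be caught and reported via config.warn()
    # instead of silently skipping collection of the whole module.
    try:
        return collector.collect()
    except pytest.skip.Exception as exc:
        collector.config.warn(
            code="C1",
            message="pytest.skip() called during collection: %s" % exc.msg,
        )
        return []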

@MichaelAquilina
Contributor Author

@RonnyPfannschmidt should I open a separate pull request for that, seeing as it's slightly unrelated?

@RonnyPfannschmidt
Member

@MichaelAquilina yes please

@@ -29,8 +29,16 @@ corresponding to the "short" letters shown in the test progress::
Marking a test function to be skipped
-------------------------------------------

The simplest way to skip a test function is to mark it with the `skip` decorator
(added in 2.8) which may be passed an optional `reason`:
Member

you might want to use the versionadded rst directive there

Member

That would be:

 Marking a test function to be skipped
 -------------------------------------------

.. versionadded:: 2.9

The simplest way to skip ...

Member

Btw good job on the docs! 👍

@RonnyPfannschmidt added this to the 2.8.1 milestone on Sep 21, 2015
@hpk42
Contributor

hpk42 commented Sep 22, 2015

FWIW i don't mind pytest.skip detecting if it's used as a decorator and erroring out -- if this can be done safely and doesn't complicate the implementation (pytest.mark is already a bit messy if you ask me). It should not silently behave like a marker, however.

As to whether it can be done safely: usually pytest.skip takes a message -- how can you then, in its function body, safely know whether it's being used as a decorator?
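
One possible heuristic, purely as a sketch (this is not pytest's implementation): when pytest.skip is applied as a bare decorator, the "message" it receives is the decorated function object, so a callable argument is a strong hint of misuse:

import pytest
from _pytest.config import UsageError

def skip(msg=""):
    # Sketch only: a callable "message" most likely means @pytest.skip was
    # used without parentheses as a decorator.  Note this does not catch
    # @pytest.skip("reason"), which raises at import time and would need the
    # collection-side handling discussed above.
    if callable(msg):
        raise UsageError(
            "pytest.skip() must not be used as a decorator; "
            "use @pytest.mark.skip(reason=...) instead"
        )
    __tracebackhide__ = True
    raise pytest.skip.Exception(msg)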

@nicoddemus
Member

Just a reminder, this should go into the features (2.9) branch.

@nicoddemus
Member

@flub

For part 2, if pytest.skip is used as a decorator it should not behave as pytest.mark.skip, instead it should result in a collection error. Maybe simply raising an exception is sufficient for this.

👍

@RonnyPfannschmidt

instead of teaching skip to return a mark, we could have collection catch skip errors and give a warning

I'm not a big fan of this... warnings by default don't explicitly state what happened (you must pass -rw to see the message). Imagine a new user evaluating/playing around with pytest, writing a single test using @pytest.skip instead of @pytest.mark.skip: he would see that no tests were run, and a mysterious 1 pytest-warning, which is not obvious to find. I think an explicit error would convey the misuse better.

(I just realized that the output of pytest --help for -r chars still shows: (w) warnings instead of (w)pytest-warnings... 😳 will fix that right away).

@hpk42

It should not silently behave like a marker, however.

👍

        pass
    """)
    rec = testdir.inline_run()
    rec.assertoutcome(skipped=1)
Member

Also check the reason here please

Contributor Author

How would I check the reason here if it's not specified though? Is matching the string "skipped instance" good enough?

Member

I assume the reason in this case was "False", right? I mean, the only possible parameter for skip is a reason, unlike skipif which receives (condition, reason). So I meant that you should also use fnmatch_lines to ensure the expected message is being displayed.
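
For example, a check along those lines could look like this (test name and reason string made up here, assuming the pytester plugin / testdir fixture is available as in pytest's own test suite):

def test_skip_reason_is_reported(testdir):
    testdir.makepyfile("""
        import pytest

        @pytest.mark.skip(reason='some reason')
        def test_foo():
            pass
    """)
    result = testdir.runpytest("-rs")
    # the -rs summary line should contain the given reason
    result.stdout.fnmatch_lines(["*some reason*"])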

@MichaelAquilina
Contributor Author

Thanks for all the feedback guys! Will probably get a chance to work on this sometime this evening or tomorrow. I'll look through all your comments and update as necessary :)

@MichaelAquilina
Contributor Author

@nicoddemus should this be marked for 2.8.1 or 2.9 in the docs seeing as this pull request is marked for the 2.8.1 milestone?

@nicoddemus
Member

As it is a new feature, 2.9 please. New features, even minor ones, should be released in a new major version, otherwise people have to depend on pytest>=2.8.1, which is a little weird. I will update the milestone. 😄
