
WIP Change outcome to 'passed' for xfail unless it's strict #1795

Merged (17 commits) on Aug 18, 2016

Conversation

@hackebrot (Member)

I am not entirely sure if this is the right way to do it.

This is very much WIP, as the tests are currently failing.

Can anybody help me out with this one, please 😃

Resolves #1546

@nicoddemus (Member)

Hey @hackebrot

I think you are going in the right direction; I took a quick look at the failing junitxml tests, but I believe they were not entirely correct in the first place, or at least the new meaning makes more sense.

I will review this more carefully tomorrow as I'm short on time right now.

@hackebrot (Member Author)

Thanks @nicoddemus! 🙇

Have a nice evening. 😄

@@ -260,7 +266,7 @@ def pytest_report_teststatus(report):
     if hasattr(report, "wasxfail"):
         if report.skipped:
             return "xfailed", "x", "xfail"
-        elif report.failed:
+        elif report.passed:
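For context, here is a minimal sketch of what the hook reads like after this change; the xpassed branch is an assumption based on the diff context, not a verbatim quote of the file:

def pytest_report_teststatus(report):
    # Reports carrying a "wasxfail" attribute come from xfail-marked tests.
    if hasattr(report, "wasxfail"):
        if report.skipped:
            # The expected failure happened: (category, progress char, verbose word).
            return "xfailed", "x", "xfail"
        elif report.passed:
            # A non-strict xfail that unexpectedly passed is now a passed test.
            return "xpassed", "X", "XPASS"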
Member

@fabioz I think this will mean you no longer have to do special handling for this in pydev_runfiles_pytest2.py, correct?

@hackebrot (Member Author)

@nicoddemus do you think we need to check for strict_xfail here too?

@nicoddemus (Member), Aug 15, 2016

No need: strict xfails now have their report outcome set to failed and will appear as a normal failure.

@hackebrot (Member Author)

okay, that's what I thought. Thanks!

@nicoddemus (Member)

We should also add somewhere in the docs that the skipping plugin adds a wasxfail attribute to TestReport objects, although I'm not sure where the best place to do that would be.
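For illustration, such docs could show a conftest.py hook consuming the attribute; pytest_runtest_logreport is the real hook, but the body below is just a sketch:

def pytest_runtest_logreport(report):
    # The skipping plugin sets "wasxfail" (the xfail explanation string)
    # on reports for tests marked with @pytest.mark.xfail.
    if report.when == "call" and hasattr(report, "wasxfail"):
        print("xfail-marked test:", report.nodeid, "->", report.outcome)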

@@ -245,7 +246,12 @@ def pytest_runtest_makereport(item, call):
         rep.outcome = "skipped"
         rep.wasxfail = evalxfail.getexplanation()
     elif call.when == "call":
-        rep.outcome = "failed"  # xpass outcome
+        strict_default = item.config.getini('xfail_strict')
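The rest of the added hunk is truncated above; judging by the discussion, the strict handling presumably reads roughly like this sketch (everything past the first line is an assumption, not the verbatim diff):

strict_default = item.config.getini('xfail_strict')
is_strict_xfail = evalxfail.get('strict', strict_default)
explanation = evalxfail.getexplanation()
if is_strict_xfail:
    # Strict xfail: an unexpected pass is a real failure.
    rep.outcome = "failed"
    rep.longrepr = "[XPASS(strict)] {0}".format(explanation)
else:
    # Non-strict xfail: an unexpected pass is now reported as passed.
    rep.outcome = "passed"
    rep.wasxfail = explanation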
Member

Hmm, I think this is a more correct implementation of "strict xfail" than the one I did... with this, I think we no longer need check_strict_xfail (used in pytest_pyfunc_call) in this module at all, because failing/passing will be handled here, which seems more appropriate now. 😄

@hackebrot (Member Author)

Hey @nicoddemus! Thank you so much for your thorough code review.

I am currently on vacation (on some Scottish island 😉). Feel free to pick this up.

That'd be awesome, so we can get 3.0 released soon-ish 😁

@hackebrot (Member Author)

Hey @nicoddemus, I've added some changes. Please let me know your thoughts! 😄

@nicoddemus (Member)

@hackebrot thanks! The changes look good; we just have to fix the junitxml tests. They broke because we changed the semantics slightly: an "xpass" for a non-strict xfail has now become a "passing test" instead of a "skipped test". I'm not sure about it, but it seems sensible to me.

@hackebrot (Member Author)

Should an XPASS in non-strict mode be:

  • a skipped test
  • a passed test

(It's a failure in strict mode.)

@pytest-dev/core what do you think? 😄

@The-Compiler (Member)

XPASS being a skipped test sounds really weird - what's the rationale for that? The test did run (and passed), so surely it wasn't skipped? So +1 for passed.

@RonnyPfannschmidt (Member)

It's clearly a "passed" test when non-strict.

@hackebrot (Member Author)

I don't think it should be skipped, but that looks to be what is reported on master, see https://github.com/pytest-dev/pytest/blob/master/testing/test_junitxml.py#L103

@RonnyPfannschmidt (Member)

that would be a bug then ^^

@hackebrot (Member Author)

Okay, cool 😄 so to conclude:

Non-strict mode (default)

  • xfail > test fails > xfailed (passed)
  • xfail > test passes > xpass (passed)

Strict mode

  • xfail(strict=True) > test fails > xfailed (passed)
  • xfail(strict=True) > test passes > failed (failed)

Everyone happy?

@nicoddemus (Member)

I don't think it should be skipped, but that looks to be what is reported on master, see

That's what I meant by:

They broke because we have changed the semantics slightly: "xpass" for non-strict xfail now became a "passing test", instead of a "skipped test".

😁

Okay, cool 😄 so to conclude:
(snip table)

On that table, what does > xfailed (passed) mean exactly? That report.outcome == "passed" (or IOW report.passed is True)? Just to clarify.
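As background, a report's passed / failed / skipped attributes in pytest are shorthand for checking the outcome string, so the two phrasings are equivalent:

# In pytest report objects, these checks are equivalent:
report.passed   # same as: report.outcome == "passed"
report.failed   # same as: report.outcome == "failed"
report.skipped  # same as: report.outcome == "skipped"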

@hackebrot (Member Author)

Test Code

# test_nope.py

import pytest

@pytest.mark.xfail
def test_fail_non_strict():
    assert False

@pytest.mark.xfail
def test_pass_non_strict():
    assert True

@pytest.mark.xfail(strict=True)
def test_fail_strict():
    assert False

@pytest.mark.xfail(strict=True)
def test_pass_strict():
    assert True

# conftest.py

def pytest_runtest_logreport(report):
    print()
    print("report.nodeid   '{}'".format(report.nodeid))
    print("report.when     '{}'".format(report.when))
    print("report.outcome  '{}'".format(report.outcome))
    print("report.wasxfail '{}'".format(getattr(report, 'wasxfail', 'NOT SET')))
    print()

@hackebrot (Member Author)

Output on master:

============================= test session starts ==============================
collected 4 items

test_nope.py 
report.nodeid   'test_nope.py::test_fail_non_strict'
report.when     'setup'
report.outcome  'passed'
report.wasxfail 'NOT SET'

x
report.nodeid   'test_nope.py::test_fail_non_strict'
report.when     'call'
report.outcome  'skipped'
report.wasxfail ''


report.nodeid   'test_nope.py::test_fail_non_strict'
report.when     'teardown'
report.outcome  'passed'
report.wasxfail 'NOT SET'


report.nodeid   'test_nope.py::test_pass_non_strict'
report.when     'setup'
report.outcome  'passed'
report.wasxfail 'NOT SET'

X
report.nodeid   'test_nope.py::test_pass_non_strict'
report.when     'call'
report.outcome  'failed'
report.wasxfail ''


report.nodeid   'test_nope.py::test_pass_non_strict'
report.when     'teardown'
report.outcome  'passed'
report.wasxfail 'NOT SET'


report.nodeid   'test_nope.py::test_fail_strict'
report.when     'setup'
report.outcome  'passed'
report.wasxfail 'NOT SET'

x
report.nodeid   'test_nope.py::test_fail_strict'
report.when     'call'
report.outcome  'skipped'
report.wasxfail ''


report.nodeid   'test_nope.py::test_fail_strict'
report.when     'teardown'
report.outcome  'passed'
report.wasxfail 'NOT SET'


report.nodeid   'test_nope.py::test_pass_strict'
report.when     'setup'
report.outcome  'passed'
report.wasxfail 'NOT SET'

F
report.nodeid   'test_nope.py::test_pass_strict'
report.when     'call'
report.outcome  'failed'
report.wasxfail 'NOT SET'


report.nodeid   'test_nope.py::test_pass_strict'
report.when     'teardown'
report.outcome  'passed'
report.wasxfail 'NOT SET'


=========================== short test summary info ============================
XFAIL test_nope.py::test_fail_non_strict
XFAIL test_nope.py::test_fail_strict
XPASS test_nope.py::test_pass_non_strict 

=================================== FAILURES ===================================
_______________________________ test_pass_strict _______________________________
[XPASS(strict)] 
================ 1 failed, 2 xfailed, 1 xpassed in 0.01 seconds ================

@nicoddemus (Member)

Here's the output using @hackebrot's branch (only for when == call):

============================= test session starts =============================
platform win32 -- Python 3.5.0, pytest-2.9.3.dev0, py-1.4.31, pluggy-0.3.1
rootdir: X:\pytest, inifile: tox.ini
plugins: hypothesis-3.1.0, xdist-1.14
collected 4 items

tmp\test_nope.py x
report.nodeid   'tmp/test_nope.py::test_fail_non_strict'
report.outcome  'skipped'
report.wasxfail ''

X
report.nodeid   'tmp/test_nope.py::test_pass_non_strict'
report.outcome  'passed'
report.wasxfail ''

x
report.nodeid   'tmp/test_nope.py::test_fail_strict'
report.outcome  'skipped'
report.wasxfail ''

F
report.nodeid   'tmp/test_nope.py::test_pass_strict'
report.outcome  'failed'
report.wasxfail 'NOT SET'

=========================== short test summary info ===========================
XFAIL tmp/test_nope.py::test_fail_non_strict
XFAIL tmp/test_nope.py::test_fail_strict
XPASS tmp/test_nope.py::test_pass_non_strict
================================== FAILURES ===================================
______________________________ test_pass_strict _______________________________
[XPASS(strict)]
=============== 1 failed, 2 xfailed, 1 xpassed in 0.02 seconds ================

And here's the junit XML file:

<?xml version="1.0" encoding="utf-8"?>
<testsuite errors="0" failures="1" name="pytest" skips="2" tests="4" time="0.020">
    <testcase classname="tmp.test_nope" file="tmp\test_nope.py" line="2" name="test_fail_non_strict" time="0.0">
        <skipped message="expected test failure"></skipped>
    </testcase>
    <testcase classname="tmp.test_nope" file="tmp\test_nope.py" line="6" name="test_pass_non_strict" time="0.0"></testcase>
    <testcase classname="tmp.test_nope" file="tmp\test_nope.py" line="10" name="test_fail_strict" time="0.0">
        <skipped message="expected test failure"></skipped>
    </testcase>
    <testcase classname="tmp.test_nope" file="tmp\test_nope.py" line="14" name="test_pass_strict" time="0.0">
        <failure message="[XPASS(strict)] ">[XPASS(strict)]</failure>
    </testcase>
</testsuite>

Parsing the XML file, we can see:

  • test_fail_non_strict: SKIPPED
  • test_pass_non_strict: PASSED
  • test_fail_strict: SKIPPED
  • test_pass_strict: FAILED

IMHO the XML now looks better because a non-strict xpass now reports PASSED, so the tests in test_junitxml.py should be updated to reflect this.

For comparison, here's the same summary for the master branch:

  • test_fail_non_strict: SKIPPED
  • test_pass_non_strict: SKIPPED
  • test_fail_strict: SKIPPED
  • test_pass_strict: FAILED
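As a quick sketch of how those summaries can be derived from the junit file (standard library only; the junit.xml path is an assumption):

import xml.etree.ElementTree as ET

def summarize(path):
    # A <testcase> with a <failure> child failed, one with a <skipped>
    # child was skipped, and a bare <testcase> passed.
    for case in ET.parse(path).getroot().iter("testcase"):
        if case.find("failure") is not None:
            status = "FAILED"
        elif case.find("skipped") is not None:
            status = "SKIPPED"
        else:
            status = "PASSED"
        print(case.get("name") + ":", status)

summarize("junit.xml")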

@The-Compiler (Member)

That sounds right to me - so what's left to do here? I'm guessing @hackebrot needs to adjust the existing tests and then this is ready for review?

@hackebrot (Member Author)

I find it slightly confusing that xfail results in SKIPPED. The test item is being run, so I don't quite see how this should result in skipped.

I'd much rather expect the following:

  • test_fail_non_strict: PASSED
  • test_pass_non_strict: PASSED
  • test_fail_strict: PASSED
  • test_pass_strict: FAILED

@The-Compiler (Member)

@hackebrot Hmm, you do have a point there. On second thought, I agree (except with xfail(run=False), I guess...)
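For the run=False exception: with run=False pytest never executes the test, so reporting it as skipped genuinely fits. A minimal example:

import pytest

@pytest.mark.xfail(run=False, reason="would hang the test process")
def test_never_runs():
    # pytest does not call this body at all, so "skipped" accurately
    # describes what happened to the test item.
    assert False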
