
Show "short test summary info" after tracebacks and warnings #3255

Merged
2 commits merged into pytest-dev:features from nicoddemus:post-summary on Feb 27, 2018

Conversation

nicoddemus
Member

Currently, when using -ra, the "short test summary info" is shown before the full list of tracebacks and errors:

.tmp\test_fail.py::test FAILED                       [ 50%]
.tmp\test_fail.py::test_s SKIPPED                    [100%]
================= short test summary info =================
FAIL .tmp\test_fail.py::test
SKIP [1] C:\pytest\.tmp\test_fail.py:8: skipping

======================== FAILURES =========================
__________________________ test ___________________________

    def test():
>       assert 0
E       assert 0

.tmp\test_fail.py:4: AssertionError
=========== 1 failed, 1 skipped in 0.04 seconds ===========

When you have a large test suite and a lot of tests fail, the short test summary info easily gets lost amidst the huge number of lines printed by all the tracebacks.

I tracked this down to a change introduced in #1305 (by yours truly) which was meant to fix the warnings count, but had this unfortunate side effect.

With the change the output now becomes:

.tmp\test_fail.py::test FAILED                       [ 50%]
.tmp\test_fail.py::test_s SKIPPED                    [100%]

======================== FAILURES =========================
__________________________ test ___________________________

    def test():
>       assert 0
E       assert 0

.tmp\test_fail.py:4: AssertionError
================= short test summary info =================
FAIL .tmp\test_fail.py::test
SKIP [1] C:\pytest\.tmp\test_fail.py:8: skipping
=========== 1 failed, 1 skipped in 0.05 seconds ===========

In order to avoid breakages and further surprises, I thought it better to create a new hook, because changing the place where pytest_terminal_summary is called yet again would be worse.
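
For reference, the output above can be reproduced with a tiny test file along these lines (a hypothetical reconstruction of .tmp\test_fail.py, not the actual file from the session):

    # hypothetical reconstruction of .tmp\test_fail.py; run with `pytest -ra`
    import pytest


    def test():
        assert 0


    def test_s():
        pytest.skip("skipping")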

nicoddemus added a commit to nicoddemus/pytest that referenced this pull request Feb 22, 2018
@coveralls

coveralls commented Feb 22, 2018


Coverage increased (+0.05%) to 92.586% when pulling 94050a8 on nicoddemus:post-summary into d196ab4 on pytest-dev:features.

@RonnyPfannschmidt (Member) left a comment

I am under the impression this needlessly introduces a new hook, when hook ordering and moving the terminal plugin's own summary into said hook would be sufficient.

tryfirst/trylast should match it nicely already.
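
A rough sketch of that idea (illustrative only, assuming pytest.hookimpl's trylast marker; not the code merged in this PR):

    import pytest

    class TerminalReporter(object):
        # ... existing reporter attributes and summary_* methods ...

        # marking the built-in summary trylast means third-party
        # pytest_terminal_summary implementations run before it
        @pytest.hookimpl(trylast=True)
        def pytest_terminal_summary(self):
            self.summary_errors()
            self.summary_failures()
            self.summary_warnings()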

@nicoddemus
Member Author

> I am under the impression this needlessly introduces a new hook, when hook ordering and moving the terminal plugin's own summary into said hook would be sufficient.

Good point, I was also under that impression but thought I'd open a PR first to invite comments. But if we follow this approach, won't we re-introduce #1305?

@nicoddemus
Member Author

@ionelmc what do you think? I'm under the impression that if we follow @RonnyPfannschmidt's suggestion, your code which initially detected this problem will keep working as expected, because your third-party hook will be called first by default.

@ionelmc
Member

ionelmc commented Feb 26, 2018

@nicoddemus dunno which code or hook you're thinking about but I can say:

  • I would want the list of failed test names and warnings to appear at the very end
  • I usually use --tb=native in my projects

@nicoddemus
Member Author

Sorry @ionelmc I meant regarding #1305. 😁

@ionelmc
Member

ionelmc commented Feb 26, 2018

I thought the ordering problem was fixed? If it regressed and this PR fixes it again, 👍 from me 😁

@ionelmc
Member

ionelmc commented Feb 26, 2018

Ah, well pytest-benchmark is using pytest_terminal_summary, so I guess this PR is fine? I can give it a test if you want.

@nicoddemus
Member Author

The original problem was that pytest-benchmark implemented pytest_terminal_summary and issued warnings there, but because of how it was originally implemented those warnings were not displayed:

            self.summary_errors()
            self.summary_failures()
            self.summary_warnings()  # < warning count captured and shown here
            self.config.hook.pytest_terminal_summary(terminalreporter=self)  # <- pytest-benchmark hook called at this point

@RonnyPfannschmidt suggests that the terminal plugin itself implements pytest_terminal_summary, and that we can use tryfirst/trylast to push the summary it produces to the end of the output:

            
    # pytest_sessionfinish would then only invoke the hook:
    self.config.hook.pytest_terminal_summary(terminalreporter=self)

    # while the terminal reporter implements the hook itself:
    def pytest_terminal_summary(self):
        self.summary_errors()
        self.summary_failures()
        self.summary_warnings()

I'm pretty sure this will work and not introduce a regression as pytest-benchmark's hook will be called first because it is a 3rd party plugin.
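
For context, a third-party implementation of the hook (e.g. in a plugin or conftest.py) looks roughly like this; the section title below is made up for illustration, it is not pytest-benchmark's actual code:

    def pytest_terminal_summary(terminalreporter):
        # add a plugin-specific section; with the reordering above this is
        # printed before the built-in errors/failures/warnings summary
        terminalreporter.section("plugin extra info")
        terminalreporter.write_line("one line of plugin-provided output")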

@nicoddemus
Member Author

@RonnyPfannschmidt I've implemented your suggestion. 👍

@@ -480,16 +480,21 @@ def pytest_sessionfinish(self, exitstatus):
                                   EXIT_NOTESTSCOLLECTED)
         if exitstatus in summary_exit_codes:
             self.config.hook.pytest_terminal_summary(terminalreporter=self,
+                                                      config=self.config,
@RonnyPfannschmidt (Member)

ditto

@@ -489,8 +489,16 @@ def pytest_report_teststatus(report):
     Stops at first non-None result, see :ref:`firstresult` """


-def pytest_terminal_summary(terminalreporter, exitstatus):
+def pytest_terminal_summary(config, terminalreporter, exitstatus):
     """ add additional section in terminal summary reporting. """
@RonnyPfannschmidt (Member)

This is a breaking change; any caller of the hook would need to add the extra parameter.

This needs to be addressed in pluggy first.

@nicoddemus (Member, Author)

This hook is called only by terminal.py, and we've added parameters to hooks in the past; it's not clear to me why this hook in particular cannot receive an extra argument at this point.

@RonnyPfannschmidt (Member)

In this particular hook's case it's probably less of an issue; however, all hooks and the required arguments they take are currently part of the public API.

For example, pytest_deselected is in dire need of an added reason argument; however, it's one of the hooks that are supposed to be called by 3rd parties, and they would simply break.
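
For illustration, pytest_deselected is typically fired by third-party code itself, e.g. from a filtering hook (a sketch, not taken from any particular plugin), which is why a new required argument would break such callers:

    def pytest_collection_modifyitems(config, items):
        # deselect items whose keywords mention "slow" and notify pytest
        deselected = [item for item in items if "slow" in item.keywords]
        if deselected:
            config.hook.pytest_deselected(items=deselected)
            items[:] = [item for item in items if item not in deselected]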

@nicoddemus (Member, Author)

I see what you mean regarding pytest_deselected, but AFAIK that is the only hook which we expect to be called by 3rd parties; all other hooks are called by pytest itself.

So IMHO your concern is valid, but not for most hooks.

Having said all that, if you really want I can remove the parameter; it was not really required by the hook implementations I touched.
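
For illustration, a hook implementation that needs the config can already reach it through the reporter, so the extra parameter is not strictly necessary (sketch only, not code from this PR):

    def pytest_terminal_summary(terminalreporter, exitstatus):
        # the config object is reachable via the reporter itself
        config = terminalreporter.config
        if config.option.verbose:
            terminalreporter.write_line("some extra, verbose-only summary line")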

@RonnyPfannschmidt (Member)

I'm leaning towards removing it, but as you said, it's unlikely to be invoked by 3rd parties as of now.
We ought to create a mechanism through which 3rd parties can defer to hooks later on.

@nicoddemus (Member, Author)

No worries, removed it. 👍

@RonnyPfannschmidt
Member

thanks ^^
