.. _`skip and xfail`:

skip and xfail mechanisms
=====================================================================

You can skip or "xfail" test functions, either by marking functions
through a decorator or by calling the ``pytest.skip|xfail`` functions.

A *skip* means that you expect your test to pass unless a certain
configuration or condition (e.g. wrong Python interpreter, missing
dependency) prevents it from running.  An *xfail* means that you expect
your test to fail because of an implementation problem.  py.test counts
and lists *xfailing* tests separately, and you can attach information
such as a bug number or a URL to give human readable context for the
problem.

Usually detailed information about skipped/xfailed tests is not shown
at the end of a test run to avoid cluttering the output.  You can use
the ``-r`` option to see details corresponding to the "short" letters
shown in the test progress::

    py.test -rxs  # show extra info on skipped and xfailed tests

(See :ref:`how to change command line options defaults`)
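If you want this report on every run, the referenced section describes
how to make options the default; a minimal sketch, assuming your project
keeps its py.test options in an ini-style file such as ``setup.cfg`` or
``tox.ini`` (the file name and location are assumptions about your layout)::

    # setup.cfg or tox.ini of your project (hypothetical location)
    [pytest]
    addopts = -rxs
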

.. _skipif:

Skipping a single function
-------------------------------------------

Here is an example of marking a test function to be skipped
when run on a Python3 interpreter::

    import sys
    import pytest

    @pytest.mark.skipif("sys.version_info >= (3,0)")
    def test_function():
        ...

During test function setup the skipif condition is
evaluated by calling ``eval(expr, namespace)``. The namespace
contains all the module globals of the test function so that
you can for example check for versions::

    import mymodule

    @pytest.mark.skipif("mymodule.__version__ < '1.2'")
    def test_function():
        ...

The test function will be skipped and not run if ``mymodule`` is below
the specified version.  The main reason for specifying the condition as
a string is that you get more detailed reporting of xfail/skip reasons.

Internally, the namespace is first initialized by putting the ``sys``
and ``os`` modules and the test ``config`` object into it, and is then
updated with the module globals.  The latter allows you to skip based
on a test configuration value::
@pytest.mark.skipif("not config.getvalue('db')")
def test_function(...):
...
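For the ``db`` example to work, something has to register a ``db``
option in the first place.  A minimal ``conftest.py`` sketch (the option
name, default and help text are assumptions for illustration)::

    # conftest.py (hypothetical)
    def pytest_addoption(parser):
        # register a --db option; its value becomes available as
        # config.getvalue('db') inside skipif expressions
        parser.addoption("--db", action="store", default=None,
                         help="run tests that require a database backend")
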
Create a shortcut for your conditional skip decorator
at module level like this::

    win32only = pytest.mark.skipif("sys.platform != 'win32'")

    @win32only
    def test_function():
        ...

skip all test functions of a class
--------------------------------------

As with all function :ref:`marking` you can do it at
`whole class- or module level`_.  Here is an example
of skipping all methods of a test class based on platform::

    class TestPosixCalls:
        pytestmark = pytest.mark.skipif("sys.platform == 'win32'")

        def test_function(self):
            "will not be setup or run under 'win32' platform"

The ``pytestmark`` class attribute applies the mark to each test
function of the class.  If your code targets Python 2.6 or above you
can equivalently use the ``skipif`` decorator on classes::
@pytest.mark.skipif("sys.platform == 'win32'")
class TestPosixCalls:
def test_function(self):
"will not be setup or run under 'win32' platform"
It is fine in general to apply multiple "skipif" decorators
on a single function - this means that if any of the conditions
apply the function will be skipped.
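For instance, a sketch of stacking two conditions (the conditions
themselves are only illustrative) - the test is skipped if either
one holds::

    @pytest.mark.skipif("sys.platform == 'win32'")
    @pytest.mark.skipif("sys.version_info >= (3,0)")
    def test_function():
        ...
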
.. _`whole class- or module level`: mark.html#scoped-marking

.. _xfail:

mark a test function as expected to fail
-------------------------------------------------------

You can use the ``xfail`` marker to indicate that you
expect the test to fail::

    @pytest.mark.xfail
    def test_function():
        ...

This test will be run but no traceback will be reported
when it fails. Instead terminal reporting will list it in the
"expected to fail" or "unexpectedly passing" sections.
By specifying on the commandline::

    pytest --runxfail

you can force the running and reporting of an ``xfail`` marked test
as if it weren't marked at all.
Same as with skipif_ you can also selectively expect a failure
depending on platform::
@pytest.mark.xfail("sys.version_info >= (3,0)")
def test_function():
...
You can also avoid running an "xfail" test at all or
specify a reason such as a bug ID or similar. Here is
a simple test file with usages:

.. literalinclude:: example/xfail_demo.py

Running it with the report-on-xfail option gives this output::

    example $ py.test -rx xfail_demo.py
    =========================== test session starts ============================
    platform linux2 -- Python 2.6.6 -- pytest-2.0.1
    collecting ... collected 5 items

    xfail_demo.py xxxxx

    ========================= short test summary info ==========================
    XFAIL xfail_demo.py::test_hello
    XFAIL xfail_demo.py::test_hello2
      reason: [NOTRUN]
    XFAIL xfail_demo.py::test_hello3
      condition: hasattr(os, 'sep')
    XFAIL xfail_demo.py::test_hello4
      bug 110
    XFAIL xfail_demo.py::test_hello5
      reason: reason
    ======================== 5 xfailed in 0.04 seconds =========================

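Whether a marked test is run at all, and what reason is reported, is
controlled by keyword arguments of the ``xfail`` marker.  A short sketch
(the test names and bodies are made up for illustration, not copied from
``xfail_demo.py``)::

    import pytest

    @pytest.mark.xfail(run=False)
    def test_not_run():
        ...  # collected and reported as xfail, but never executed

    @pytest.mark.xfail(reason="bug 110")
    def test_known_bug():
        ...
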
imperative xfail from within a test or setup function
------------------------------------------------------

If you cannot declare xfail-conditions at import time
you can also imperatively produce an XFail-outcome from
within test or setup code.  Example::

    def test_function():
        if not valid_config():
            pytest.xfail("unsupported configuration")

skipping on a missing import dependency
--------------------------------------------------

You can use the following import helper at module level
or within a test or test setup function::
docutils = pytest.importorskip("docutils")
If ``docutils`` cannot be imported here, this will lead to a
skip outcome of the test.  You can also skip if a library does
not come with a high enough version::
docutils = pytest.importorskip("docutils", minversion="0.3")
The version will be read from the specified module's ``__version__`` attribute.
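
Either way the helper returns the imported module, so you can also
guard a single test from inside its body; a small sketch (the test
name and assertion are only illustrative)::

    import pytest

    def test_docutils_version():
        # skips only this test if docutils is missing or older than 0.3
        docutils = pytest.importorskip("docutils", minversion="0.3")
        assert docutils.__version__
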

imperative skip from within a test or setup function
------------------------------------------------------

If for some reason you cannot declare skip-conditions
you can also imperatively produce a Skip-outcome from
within test or setup code.  Example::

    def test_function():
        if not valid_config():
            pytest.skip("unsupported configuration")