Add documentation for marks

This commit is contained in:
Bruno Oliveira 2018-02-28 20:34:20 -03:00
parent ed8b1efc16
commit 988ace9eb2
6 changed files with 137 additions and 34 deletions

View File

@ -116,6 +116,21 @@ class MarkerError(Exception):
def param(*values, **kw):
    """Specify a parameter in a `pytest.mark.parametrize`_ call.

    .. code-block:: python

        @pytest.mark.parametrize("test_input,expected", [
            ("3+5", 8),
            pytest.param("6*9", 42, marks=pytest.mark.xfail),
        ])
        def test_eval(test_input, expected):
            assert eval(test_input) == expected

    :param values: variable args of the values of the parameter set, in order.
    :keyword marks: a single mark or a list of marks to be applied to this parameter set.
    :keyword str id: the id to attribute to this parameter set.
    """
    return ParameterSet.param(*values, **kw)

View File

@ -717,7 +717,7 @@ class CallSpec2(object):
class Metafunc(fixtures.FuncargnamesCompatAttr):
"""
Metafunc objects are passed to the ``pytest_generate_tests`` hook.
Metafunc objects are passed to the :func:`pytest_generate_tests <_pytest.hookspec.pytest_generate_tests>` hook.
They help to inspect a test function and to generate tests according to
test configuration or values specified in the class or module where a
test function is defined.

View File

@ -4,7 +4,6 @@
Marking test functions with attributes
=================================================================
.. currentmodule:: _pytest.mark
By using the ``pytest.mark`` helper you can easily set
metadata on your test functions. There are
@ -27,15 +26,3 @@ which also serve as documentation.
:ref:`fixtures <fixtures>`.
API reference for mark related objects
------------------------------------------------
.. autoclass:: MarkGenerator
:members:
.. autoclass:: MarkDecorator
:members:
.. autoclass:: MarkInfo
:members:

View File

@ -33,7 +33,7 @@ pytest enables test parametrization at several levels:
.. versionchanged:: 2.4
Several improvements.
The builtin ``pytest.mark.parametrize`` decorator enables
The builtin :ref:`pytest.mark.parametrize ref` decorator enables
parametrization of arguments for a test function. Here is a typical example
of a test function that implements checking that a certain input leads
to an expected output::
@ -206,12 +206,3 @@ More examples
For further examples, you might want to look at :ref:`more
parametrization examples <paramexamples>`.
.. _`metafunc object`:
The **metafunc** object
-------------------------------------------
.. currentmodule:: _pytest.python
.. autoclass:: Metafunc
:members:

View File

@ -51,6 +51,11 @@ pytest.main
.. autofunction:: _pytest.config.main
pytest.param
~~~~~~~~~~~~
.. autofunction:: _pytest.mark.param
pytest.raises
-------------
@ -246,6 +251,10 @@ Full reference to objects accessible from :ref:`fixtures <fixture>` or hooks
:members:
:inherited-members:
.. currentmodule:: _pytest.python
.. autoclass:: Metafunc
:members:
.. autoclass:: pluggy._Result
:members:
@ -259,9 +268,114 @@ Full reference to objects accessible from :ref:`fixtures <fixture>` or hooks
.. autoclass:: pluggy.PluginManager()
:members:
.. autoclass:: pluggy.PluginManager()
.. currentmodule:: _pytest.mark
.. autoclass:: MarkGenerator
:members:
.. autoclass:: MarkDecorator
:members:
.. autoclass:: MarkInfo
:members:
Marks
-----
Marks can be used to apply metadata to *test functions* (but not fixtures), which can then be accessed by
fixtures or plugins.
.. _`pytest.mark.parametrize ref`:
pytest.mark.parametrize
~~~~~~~~~~~~~~~~~~~~~~~
**Tutorial**: :doc:`parametrize`.
.. automethod:: _pytest.python.Metafunc.parametrize
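A minimal usage sketch (the test name and parameter values here are purely illustrative):

.. code-block:: python

    import pytest

    @pytest.mark.parametrize("n,expected", [(1, 2), (3, 4)])
    def test_increment(n, expected):
        assert n + 1 == expected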
.. _`pytest.mark.skip ref`:
pytest.mark.skip
~~~~~~~~~~~~~~~~
**Tutorial**: :ref:`skip`.
Unconditionally skip a test function.
.. py:function:: pytest.mark.skip(*, reason=None)
:keyword str reason: Reason why the test function is being skipped.
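A minimal usage sketch (the reason text is illustrative):

.. code-block:: python

    import pytest

    @pytest.mark.skip(reason="no way of currently testing this")
    def test_the_unknown():
        ...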
.. _`pytest.mark.skipif ref`:
pytest.mark.skipif
~~~~~~~~~~~~~~~~~~
**Tutorial**: :ref:`skipif`.
Skip a test function if a condition is ``True``.
.. py:function:: pytest.mark.skipif(condition, *, reason=None)
:type condition: bool or str
:param condition: ``True`` or ``False`` indicating whether the test should be skipped, or a :ref:`condition string <string conditions>`.
:keyword str reason: Reason why the test function is being skipped.
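A minimal usage sketch, assuming a test that requires Python 3.6 or later:

.. code-block:: python

    import sys
    import pytest

    @pytest.mark.skipif(sys.version_info < (3, 6), reason="requires python3.6 or higher")
    def test_function():
        ...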
.. _`pytest.mark.xfail ref`:
pytest.mark.xfail
~~~~~~~~~~~~~~~~~~
**Tutorial**: :ref:`xfail`.
Mark a test function as *expected to fail*.
.. py:function:: pytest.mark.xfail(condition=None, *, reason=None, raises=None, run=True, strict=False)
:type condition: bool or str
:param condition: ``True`` or ``False`` indicating whether the test should be marked as xfail, or a :ref:`condition string <string conditions>`.
:keyword str reason: Reason why the test function is marked as xfail.
:keyword Exception raises: Exception subclass expected to be raised by the test function; other exceptions will fail the test.
:keyword bool run:
    Whether the test function should actually be executed. If ``False``, the function will always xfail and will
    not be executed (useful if a function is segfaulting).
:keyword bool strict:
    * If ``False`` (the default) the function will be shown in the terminal output as ``xfailed`` if it fails
      and as ``xpass`` if it passes. In both cases this will not cause the test suite to fail as a whole. This
      is particularly useful to mark *flaky* tests (tests that fail randomly) to be tackled later.
    * If ``True``, the function will be shown in the terminal output as ``xfailed`` if it fails, but if it
      unexpectedly passes then it will **fail** the test suite. This is particularly useful to mark functions
      that are always failing, so that there is a clear indication if they unexpectedly start to pass (for example
      when a new release of a library fixes a known bug).
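A minimal usage sketch (the exception and reason are illustrative):

.. code-block:: python

    import pytest

    @pytest.mark.xfail(raises=IndexError, reason="known issue with empty input")
    def test_empty_input():
        ...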
custom marks
~~~~~~~~~~~~
Marks are created dynamically using the factory object ``pytest.mark`` and applied as a decorator.
For example:
.. code-block:: python

    @pytest.mark.timeout(10, 'slow', method='thread')
    def test_function():
        ...
This will create and attach a :class:`MarkInfo <_pytest.mark.MarkInfo>` object to the collected
:class:`Item <_pytest.nodes.Item>`, which can then be accessed by fixtures or hooks with
:meth:`Node.get_marker <_pytest.nodes.Node.get_marker>`. The ``mark`` object will have the following attributes:
.. code-block:: python

    mark.args == (10, 'slow')
    mark.kwargs == {'method': 'thread'}
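As a sketch of the reading side, a ``conftest.py`` hook could pick the mark up from the collected item
(``timeout`` is the illustrative mark from the example above, not a builtin mark):

.. code-block:: python

    # conftest.py -- sketch: read a custom mark from a hook
    def pytest_runtest_setup(item):
        marker = item.get_marker('timeout')
        if marker is not None:
            seconds = marker.args[0]           # 10
            method = marker.kwargs['method']   # 'thread'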
Fixtures

View File

@ -71,6 +71,8 @@ It is also possible to skip the whole module using
The imperative method is useful when it is not possible to evaluate the skip condition
during import time.
**Reference**: :ref:`pytest.mark.skip ref`
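For illustration, a sketch of the imperative form (``valid_config()`` stands in for whatever runtime check applies):

.. code-block:: python

    import pytest

    def test_function():
        if not valid_config():
            pytest.skip("unsupported configuration")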
``skipif``
~~~~~~~~~~
@ -116,6 +118,8 @@ Alternatively, you can use :ref:`condition strings
<string conditions>` instead of booleans, but they can't be shared between modules easily
so they are supported mainly for backward compatibility reasons.
**Reference**: :ref:`pytest.mark.skipif ref`
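A sketch of the condition-string form (the platform check is illustrative):

.. code-block:: python

    import pytest

    @pytest.mark.skipif("sys.platform == 'win32'")
    def test_posix_only():
        ...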
Skip all test functions of a class or module
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@ -232,15 +236,7 @@ This will unconditionally make ``test_function`` ``XFAIL``. Note that no other c
after the ``pytest.xfail`` call, unlike with the marker. That's because it is implemented
internally by raising a known exception.
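For illustration, a sketch of that behaviour (``valid_config()`` is a stand-in for a runtime check):

.. code-block:: python

    import pytest

    def test_function():
        if not valid_config():
            pytest.xfail("failing configuration (but should work)")
        # nothing from this point on runs once pytest.xfail() raises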
Here's the signature of the ``xfail`` **marker** (not the function), using Python 3 keyword-only
arguments syntax:
.. code-block:: python

    def xfail(condition=None, *, reason=None, raises=None, run=True, strict=False):
**Reference**: :ref:`pytest.mark.xfail ref`
``strict`` parameter
~~~~~~~~~~~~~~~~~~~~