.. _`cache_provider`:
.. _cache:


Cache: working with cross-testrun state
=======================================

.. versionadded:: 2.8

Usage
---------

The plugin provides two command line options to rerun failures from the
last ``pytest`` invocation:

* ``--lf``, ``--last-failed`` - to only re-run the failures.
* ``--ff``, ``--failed-first`` - to run the failures first and then the rest of
  the tests.

For cleanup (usually not needed), a ``--cache-clear`` option allows removing
all cross-session cache contents ahead of a test run.

Other plugins may access the `config.cache`_ object to set/get
**json encodable** values between ``pytest`` invocations.
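
For example, a ``conftest.py`` hook could count how often the suite has been
run on a machine. This is only a minimal sketch, and the ``example/run-count``
key is purely illustrative::

    # content of a hypothetical conftest.py
    def pytest_sessionstart(session):
        # read the value stored by a previous run (or a default) and update it
        runs = session.config.cache.get("example/run-count", 0)
        session.config.cache.set("example/run-count", runs + 1)
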
.. note::

    This plugin is enabled by default, but can be disabled if needed: see
    :ref:`cmdunregister` (the internal name for this plugin is
    ``cacheprovider``).
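
    For example, using the mechanism described there, the plugin can be
    disabled for a single run with::

        pytest -p no:cacheprovider
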
Rerunning only failures or failures first
-----------------------------------------------

First, let's create 50 test invocations of which only 2 fail::

    # content of test_50.py
    import pytest

    @pytest.mark.parametrize("i", range(50))
    def test_num(i):
        if i in (17, 25):
            pytest.fail("bad luck")

If you run this for the first time, you will see two failures::

    $ pytest -q
    .................F.......F........................                  [100%]
    ================================= FAILURES =================================
    _______________________________ test_num[17] _______________________________

    i = 17

        @pytest.mark.parametrize("i", range(50))
        def test_num(i):
            if i in (17, 25):
    >           pytest.fail("bad luck")
    E           Failed: bad luck

    test_50.py:6: Failed
    _______________________________ test_num[25] _______________________________

    i = 25

        @pytest.mark.parametrize("i", range(50))
        def test_num(i):
            if i in (17, 25):
    >           pytest.fail("bad luck")
    E           Failed: bad luck

    test_50.py:6: Failed
    2 failed, 48 passed in 0.12 seconds

If you then run it with ``--lf``::

    $ pytest --lf
    =========================== test session starts ============================
    platform linux -- Python 3.x.y, pytest-3.x.y, py-1.x.y, pluggy-0.x.y
    rootdir: $REGENDOC_TMPDIR, inifile:
    collected 50 items / 48 deselected
    run-last-failure: rerun previous 2 failures

    test_50.py FF                                                        [100%]

    ================================= FAILURES =================================
    _______________________________ test_num[17] _______________________________

    i = 17

        @pytest.mark.parametrize("i", range(50))
        def test_num(i):
            if i in (17, 25):
    >           pytest.fail("bad luck")
    E           Failed: bad luck

    test_50.py:6: Failed
    _______________________________ test_num[25] _______________________________

    i = 25

        @pytest.mark.parametrize("i", range(50))
        def test_num(i):
            if i in (17, 25):
    >           pytest.fail("bad luck")
    E           Failed: bad luck

    test_50.py:6: Failed
    ================= 2 failed, 48 deselected in 0.12 seconds ==================

You have run only the two failing tests from the last run, while the 48 passing
tests have not been run ("deselected").

Now, if you run with the ``--ff`` option, all tests will be run but the
previous failures will be executed first (as can be seen from the series
of ``FF`` and dots)::

    $ pytest --ff
    =========================== test session starts ============================
    platform linux -- Python 3.x.y, pytest-3.x.y, py-1.x.y, pluggy-0.x.y
    rootdir: $REGENDOC_TMPDIR, inifile:
    collected 50 items
    run-last-failure: rerun previous 2 failures first

    test_50.py FF................................................      [100%]

    ================================= FAILURES =================================
    _______________________________ test_num[17] _______________________________

    i = 17

        @pytest.mark.parametrize("i", range(50))
        def test_num(i):
            if i in (17, 25):
    >           pytest.fail("bad luck")
    E           Failed: bad luck

    test_50.py:6: Failed
    _______________________________ test_num[25] _______________________________

    i = 25

        @pytest.mark.parametrize("i", range(50))
        def test_num(i):
            if i in (17, 25):
    >           pytest.fail("bad luck")
    E           Failed: bad luck

    test_50.py:6: Failed
    =================== 2 failed, 48 passed in 0.12 seconds ====================

The ``--nf``, ``--new-first`` option runs new tests first, followed by the rest
of the tests; in both groups, tests are also sorted by file modification time,
with more recently modified files running first.
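
For example, after adding a new test file you can run its tests before the
already known ones with::

    pytest --nf
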
Behavior when no tests failed in the last run
---------------------------------------------

When no tests failed in the last run, or when no cached ``lastfailed`` data was
found, ``pytest`` can be configured either to run all of the tests or no tests,
using the ``--last-failed-no-failures`` option, which takes one of the following values::

    pytest --last-failed-no-failures all    # run all tests (default behavior)
    pytest --last-failed-no-failures none   # run no tests and exit

.. _`config.cache`:

The new config.cache object
--------------------------------

.. regendoc:wipe

Plugins or conftest.py support code can get a cached value using the
pytest ``config`` object. Here is a basic example plugin which
implements a :ref:`fixture` that re-uses previously created state
across pytest invocations::

    # content of test_caching.py
    import pytest
    import time

    @pytest.fixture
    def mydata(request):
        val = request.config.cache.get("example/value", None)
        if val is None:
            time.sleep(9*0.6) # expensive computation :)
            val = 42
            request.config.cache.set("example/value", val)
        return val

    def test_function(mydata):
        assert mydata == 23

If you run this command once, it will take a while because
of the sleep::

    $ pytest -q
    F                                                                    [100%]
    ================================= FAILURES =================================
    ______________________________ test_function _______________________________

    mydata = 42

        def test_function(mydata):
    >       assert mydata == 23
    E       assert 42 == 23

    test_caching.py:14: AssertionError
    1 failed in 0.12 seconds

If you run it a second time, the value will be retrieved from
the cache and this will be quick::

    $ pytest -q
    F                                                                    [100%]
    ================================= FAILURES =================================
    ______________________________ test_function _______________________________

    mydata = 42

        def test_function(mydata):
    >       assert mydata == 23
    E       assert 42 == 23

    test_caching.py:14: AssertionError
    1 failed in 0.12 seconds

See the :ref:`cache-api` for more details.


Inspecting Cache content
-------------------------------

You can always peek at the content of the cache using the
``--cache-show`` command line option::

    $ pytest --cache-show
    =========================== test session starts ============================
    platform linux -- Python 3.x.y, pytest-3.x.y, py-1.x.y, pluggy-0.x.y
    rootdir: $REGENDOC_TMPDIR, inifile:
    cachedir: $REGENDOC_TMPDIR/.pytest_cache
    ------------------------------- cache values -------------------------------
    cache/lastfailed contains:
      {'test_caching.py::test_function': True}
    cache/nodeids contains:
      ['test_caching.py::test_function']
    example/value contains:
      42

    ======================= no tests ran in 0.12 seconds =======================

Clearing Cache content
-------------------------------

You can instruct pytest to clear all cache files and values
by adding the ``--cache-clear`` option like this::

    pytest --cache-clear

This is recommended for invocations from Continuous Integration
servers where isolation and correctness are more important
than speed.
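
For example, one way to do this is to always pass the option via an ``addopts``
entry in the ini file used on the CI server (a sketch; adapt it to your setup)::

    # content of pytest.ini on the CI server
    [pytest]
    addopts = --cache-clear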