.. _`cache_provider`:
.. _cache:

How to re-run failed tests and maintain state between test runs
===============================================================

Usage
-----

The plugin provides two command line options to rerun failures from the
last ``pytest`` invocation:

* ``--lf``, ``--last-failed`` - to only re-run the failures.
* ``--ff``, ``--failed-first`` - to run the failures first and then the rest of
  the tests.

For cleanup (usually not needed), a ``--cache-clear`` option allows removing
all cross-session cache contents ahead of a test run.

Other plugins may access the `config.cache`_ object to set/get
**json encodable** values between ``pytest`` invocations.

.. note::

    This plugin is enabled by default, but can be disabled if needed: see
    :ref:`cmdunregister` (the internal name for this plugin is
    ``cacheprovider``).

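Because the cache is persisted as JSON, only JSON-encodable values survive a
round trip. As a rough sanity check (``is_cache_safe`` is a hypothetical
helper for illustration, not part of pytest's API), a value is safe to cache
if it comes back unchanged from a JSON encode/decode cycle:

.. code-block:: python

    # Hypothetical helper: pytest itself does not ship this check.
    import json


    def is_cache_safe(value):
        """Return True if *value* would survive pytest's JSON-based cache."""
        try:
            return json.loads(json.dumps(value)) == value
        except (TypeError, ValueError):
            return False


    print(is_cache_safe({"example/value": 42}))  # True: plain dicts are fine
    print(is_cache_safe({1, 2, 3}))  # False: sets are not JSON-encodable

Note that tuples are JSON-encodable but come back as lists, so they also
compare unequal under this check.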
Rerunning only failures or failures first
-----------------------------------------

First, let's create 50 test invocations of which only 2 fail:

.. code-block:: python

    # content of test_50.py
    import pytest


    @pytest.mark.parametrize("i", range(50))
    def test_num(i):
        if i in (17, 25):
            pytest.fail("bad luck")

If you run this for the first time you will see two failures:

.. code-block:: pytest

    $ pytest -q
    .................F.......F........................                  [100%]
    ================================= FAILURES =================================
    _______________________________ test_num[17] _______________________________

    i = 17

        @pytest.mark.parametrize("i", range(50))
        def test_num(i):
            if i in (17, 25):
    >           pytest.fail("bad luck")
    E           Failed: bad luck

    test_50.py:7: Failed
    _______________________________ test_num[25] _______________________________

    i = 25

        @pytest.mark.parametrize("i", range(50))
        def test_num(i):
            if i in (17, 25):
    >           pytest.fail("bad luck")
    E           Failed: bad luck

    test_50.py:7: Failed
    ========================= short test summary info ==========================
    FAILED test_50.py::test_num[17] - Failed: bad luck
    FAILED test_50.py::test_num[25] - Failed: bad luck
    2 failed, 48 passed in 0.12s

If you then run it with ``--lf``:

.. code-block:: pytest

    $ pytest --lf
    =========================== test session starts ============================
    platform linux -- Python 3.x.y, pytest-8.x.y, pluggy-1.x.y
    rootdir: /home/sweet/project
    collected 2 items
    run-last-failure: rerun previous 2 failures

    test_50.py FF                                                        [100%]

    ================================= FAILURES =================================
    _______________________________ test_num[17] _______________________________

    i = 17

        @pytest.mark.parametrize("i", range(50))
        def test_num(i):
            if i in (17, 25):
    >           pytest.fail("bad luck")
    E           Failed: bad luck

    test_50.py:7: Failed
    _______________________________ test_num[25] _______________________________

    i = 25

        @pytest.mark.parametrize("i", range(50))
        def test_num(i):
            if i in (17, 25):
    >           pytest.fail("bad luck")
    E           Failed: bad luck

    test_50.py:7: Failed
    ========================= short test summary info ==========================
    FAILED test_50.py::test_num[17] - Failed: bad luck
    FAILED test_50.py::test_num[25] - Failed: bad luck
    ============================ 2 failed in 0.12s =============================

You have run only the two failing tests from the last run, while the 48 passing
tests have not been run ("deselected").

Now, if you run with the ``--ff`` option, all tests will be run, but the
previous failures will be executed first (as can be seen from the series
of ``FF`` and dots):

.. code-block:: pytest

    $ pytest --ff
    =========================== test session starts ============================
    platform linux -- Python 3.x.y, pytest-8.x.y, pluggy-1.x.y
    rootdir: /home/sweet/project
    collected 50 items
    run-last-failure: rerun previous 2 failures first

    test_50.py FF................................................        [100%]

    ================================= FAILURES =================================
    _______________________________ test_num[17] _______________________________

    i = 17

        @pytest.mark.parametrize("i", range(50))
        def test_num(i):
            if i in (17, 25):
    >           pytest.fail("bad luck")
    E           Failed: bad luck

    test_50.py:7: Failed
    _______________________________ test_num[25] _______________________________

    i = 25

        @pytest.mark.parametrize("i", range(50))
        def test_num(i):
            if i in (17, 25):
    >           pytest.fail("bad luck")
    E           Failed: bad luck

    test_50.py:7: Failed
    ========================= short test summary info ==========================
    FAILED test_50.py::test_num[17] - Failed: bad luck
    FAILED test_50.py::test_num[25] - Failed: bad luck
    ======================= 2 failed, 48 passed in 0.12s =======================
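The selection behind ``--lf`` and ``--ff`` can be sketched in a few lines of
plain Python. The helpers below (``last_failed_only`` and ``failed_first``)
are hypothetical, assuming the ``lastfailed`` cache is a mapping of failed
node ids; pytest's real implementation lives in the ``cacheprovider`` plugin
and also handles deselection reporting:

.. code-block:: python

    # Hypothetical helpers sketching --lf and --ff selection; not pytest API.
    def last_failed_only(test_ids, lastfailed):
        """--lf: keep only the tests that failed in the previous run."""
        return [tid for tid in test_ids if tid in lastfailed]


    def failed_first(test_ids, lastfailed):
        """--ff: run previous failures first, then the rest in collection order."""
        failed = [tid for tid in test_ids if tid in lastfailed]
        rest = [tid for tid in test_ids if tid not in lastfailed]
        return failed + rest


    ids = ["test_num[%d]" % i for i in range(50)]
    lastfailed = {"test_num[17]": True, "test_num[25]": True}
    print(last_failed_only(ids, lastfailed))  # ['test_num[17]', 'test_num[25]']
    print(failed_first(ids, lastfailed)[:3])  # the two failures, then test_num[0]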

The ``--nf``, ``--new-first`` options run new tests first, followed by the rest
of the tests. In both cases, tests are also sorted by file modification time,
with more recent files coming first.

Behavior when no tests failed in the last run
---------------------------------------------

The ``--lfnf/--last-failed-no-failures`` option governs the behavior of
``--last-failed``. It determines whether to execute tests when there are no
previously (known) failures or when no cached ``lastfailed`` data was found.

There are two options:

* ``all``: when there are no known test failures, runs all tests (the full
  test suite). This is the default.
* ``none``: when there are no known test failures, just emits a message
  stating this and exits successfully.

Example:

.. code-block:: bash

    pytest --last-failed --last-failed-no-failures all   # runs the full test suite (default behavior)
    pytest --last-failed --last-failed-no-failures none  # runs no tests and exits successfully


.. _`config.cache`:

The new config.cache object
---------------------------

.. regendoc:wipe

Plugins or conftest.py support code can get a cached value using the
pytest ``config`` object. Here is a basic example plugin which
implements a :ref:`fixture <fixture>` which re-uses previously created state
across pytest invocations:

.. code-block:: python

    # content of test_caching.py
    import pytest


    def expensive_computation():
        print("running expensive computation...")


    @pytest.fixture
    def mydata(pytestconfig):
        val = pytestconfig.cache.get("example/value", None)
        if val is None:
            expensive_computation()
            val = 42
            pytestconfig.cache.set("example/value", val)
        return val


    def test_function(mydata):
        assert mydata == 23

If you run this for the first time, you can see the print statement:

.. code-block:: pytest

    $ pytest -q
    F                                                                    [100%]
    ================================= FAILURES =================================
    ______________________________ test_function _______________________________

    mydata = 42

        def test_function(mydata):
    >       assert mydata == 23
    E       assert 42 == 23

    test_caching.py:19: AssertionError
    -------------------------- Captured stdout setup ---------------------------
    running expensive computation...
    ========================= short test summary info ==========================
    FAILED test_caching.py::test_function - assert 42 == 23
    1 failed in 0.12s

If you run it a second time, the value will be retrieved from
the cache and nothing will be printed:

.. code-block:: pytest

    $ pytest -q
    F                                                                    [100%]
    ================================= FAILURES =================================
    ______________________________ test_function _______________________________

    mydata = 42

        def test_function(mydata):
    >       assert mydata == 23
    E       assert 42 == 23

    test_caching.py:19: AssertionError
    ========================= short test summary info ==========================
    FAILED test_caching.py::test_function - assert 42 == 23
    1 failed in 0.12s
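Stripped of pytest specifics, the fixture above is the classic get-or-compute
pattern. The sketch below uses a hypothetical in-memory ``FakeCache`` to stand
in for ``pytestconfig.cache`` (the real object persists values to disk between
invocations):

.. code-block:: python

    # FakeCache is a hypothetical in-memory stand-in for pytestconfig.cache.
    class FakeCache:
        def __init__(self):
            self._store = {}

        def get(self, key, default):
            return self._store.get(key, default)

        def set(self, key, value):
            self._store[key] = value


    def get_or_compute(cache, key, compute):
        """Return the cached value for *key*, computing and storing it on a miss."""
        val = cache.get(key, None)
        if val is None:
            val = compute()
            cache.set(key, val)
        return val


    cache = FakeCache()
    calls = []

    def expensive():
        calls.append(1)
        return 42

    get_or_compute(cache, "example/value", expensive)
    get_or_compute(cache, "example/value", expensive)
    print(len(calls))  # 1 -- the expensive computation ran only once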

See the :fixture:`config.cache fixture <cache>` for more details.


Inspecting Cache content
------------------------
2015-07-11 16:13:27 +08:00
You can always peek at the content of the cache using the
2018-11-24 13:41:22 +08:00
`` --cache-show `` command line option:
.. code-block :: pytest
2015-09-17 02:41:22 +08:00
2018-05-13 18:09:47 +08:00
$ pytest --cache-show
2019-01-06 03:19:40 +08:00
=========================== test session starts ============================
2024-01-02 16:58:20 +08:00
platform linux -- Python 3.x.y, pytest-8.x.y, pluggy-1.x.y
2021-10-04 14:56:26 +08:00
rootdir: /home/sweet/project
cachedir: /home/sweet/project/.pytest_cache
2019-04-03 21:56:42 +08:00
--------------------------- cache values for '*' ---------------------------
2017-05-20 06:12:59 +08:00
cache/lastfailed contains:
2021-10-04 14:56:26 +08:00
{'test_caching.py::test_function': True}
2018-03-22 04:46:07 +08:00
cache/nodeids contains:
2021-10-04 14:56:26 +08:00
['test_caching.py::test_function']
2018-11-03 21:51:39 +08:00
cache/stepwise contains:
[]
2017-05-23 14:01:39 +08:00
example/value contains:
42
2018-05-18 16:19:46 +08:00
2019-08-30 23:43:47 +08:00
========================== no tests ran in 0.12s ===========================
2015-07-11 16:13:27 +08:00

``--cache-show`` takes an optional argument to specify a glob pattern for
filtering:

.. code-block:: pytest

    $ pytest --cache-show example/*
    =========================== test session starts ============================
    platform linux -- Python 3.x.y, pytest-8.x.y, pluggy-1.x.y
    rootdir: /home/sweet/project
    cachedir: /home/sweet/project/.pytest_cache
    ----------------------- cache values for 'example/*' -----------------------
    example/value contains:
      42

    ========================== no tests ran in 0.12s ===========================
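The values shown above live as individual JSON files under the cache
directory. The sketch below imitates that file-per-key layout; treat the
``v/`` subdirectory path as an implementation detail observed at the time of
writing, not a stable API, and always go through ``config.cache`` in real
code:

.. code-block:: python

    # Imitation of pytest's file-per-key JSON cache layout; not a stable API.
    import json
    import tempfile
    from pathlib import Path


    def cache_set(cachedir, key, value):
        path = Path(cachedir) / "v" / key
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text(json.dumps(value))


    def cache_get(cachedir, key, default):
        path = Path(cachedir) / "v" / key
        try:
            return json.loads(path.read_text())
        except (FileNotFoundError, ValueError):
            return default


    cachedir = tempfile.mkdtemp()
    cache_set(cachedir, "example/value", 42)
    print(cache_get(cachedir, "example/value", None))  # 42
    print(cache_get(cachedir, "missing/key", "fallback"))  # fallback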


Clearing Cache content
----------------------
2015-07-11 16:13:27 +08:00
You can instruct pytest to clear all cache files and values
2019-02-15 21:10:37 +08:00
by adding the `` --cache-clear `` option like this:
.. code-block :: bash
2015-07-11 16:13:27 +08:00
2016-06-21 22:16:57 +08:00
pytest --cache-clear
2015-07-11 16:13:27 +08:00
2017-01-01 01:54:47 +08:00
This is recommended for invocations from Continuous Integration
2015-07-11 16:13:27 +08:00
servers where isolation and correctness is more important
than speed.


.. _cache stepwise:

Stepwise
--------

As an alternative to ``--lf -x``, especially for cases where you expect a large
part of the test suite will fail, ``--sw``, ``--stepwise`` allows you to fix
them one at a time. The test suite will run until the first failure and then
stop. At the next invocation, tests will continue from the last failing test
and then run until the next failing test. You may use the ``--stepwise-skip``
option to ignore one failing test and stop the test execution on the second
failing test instead. This is useful if you get stuck on a failing test and
just want to ignore it until later. Providing ``--stepwise-skip`` will also
enable ``--stepwise`` implicitly.
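The resume behaviour can be sketched as follows, assuming the id of the last
failing test is what gets persisted between runs (``stepwise_selection`` is a
hypothetical helper for illustration, not pytest's actual code; the
``--cache-show`` output earlier shows the ``cache/stepwise`` key the plugin
uses):

.. code-block:: python

    # Hypothetical sketch of --stepwise resume logic; not pytest's actual code.
    def stepwise_selection(test_ids, last_failed):
        """Resume from the last failing test, or run everything if none is known."""
        if last_failed in test_ids:
            return test_ids[test_ids.index(last_failed):]
        return list(test_ids)  # no recorded failure: run the whole suite


    ids = ["test_a", "test_b", "test_c", "test_d"]
    print(stepwise_selection(ids, "test_c"))  # ['test_c', 'test_d']
    print(stepwise_selection(ids, None))  # the full suite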