Merge pull request #1780 from nicoddemus/regen-pytest30

Run regen-docs for pytest 3.0
Floris Bruynooghe 2016-08-02 12:07:16 +01:00 committed by GitHub
commit 9540106fe7
19 changed files with 116 additions and 110 deletions

View File

@ -26,7 +26,7 @@ you will see the return value of the function call::
$ pytest test_assert1.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 1 items
@ -170,7 +170,7 @@ if you run this module::
$ pytest test_assert2.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 1 items
@ -246,8 +246,7 @@ the conftest file::
f1 = Foo(1)
f2 = Foo(2)
> assert f1 == f2
E assert Comparing Foo instances:
E vals: 1 != 2
E AssertionError
test_foocompare.py:11: AssertionError
1 failed in 0.12 seconds

View File

@ -89,19 +89,23 @@ You can ask for available builtin or project-custom
Values can be any object handled by the json stdlib module.
capsys
enables capturing of writes to sys.stdout/sys.stderr and makes
Enable capturing of writes to sys.stdout/sys.stderr and make
captured output available via ``capsys.readouterr()`` method calls
which return a ``(out, err)`` tuple.
capfd
enables capturing of writes to file descriptors 1 and 2 and makes
Enable capturing of writes to file descriptors 1 and 2 and make
captured output available via ``capfd.readouterr()`` method calls
which return a ``(out, err)`` tuple.
doctest_namespace
Inject names into the doctest namespace.
pytestconfig
the pytest config object with access to command line opts.
record_xml_property
Fixture that adds extra xml properties to the tag for the calling test.
The fixture is callable with (name, value), with value being automatically
Add extra xml properties to the tag for the calling test.
The fixture is callable with ``(name, value)``, with value being automatically
xml-encoded.
monkeypatch
The returned ``monkeypatch`` funcarg provides these
The returned ``monkeypatch`` fixture provides these
helper methods to modify objects, dictionaries or os.environ::
monkeypatch.setattr(obj, name, value, raising=True)
@ -114,11 +118,11 @@ You can ask for available builtin or project-custom
monkeypatch.chdir(path)
All modifications will be undone after the requesting
test function has finished. The ``raising``
test function or fixture has finished. The ``raising``
parameter determines if a KeyError or AttributeError
will be raised if the set/deletion operation has no target.
pytestconfig
the pytest config object with access to command line opts.
This fixture is ``invocation``-scoped.
recwarn
Return a WarningsRecorder instance that provides these methods:
@ -130,7 +134,7 @@ You can ask for available builtin or project-custom
tmpdir_factory
Return a TempdirFactory instance for the test session.
tmpdir
return a temporary directory path object
Return a temporary directory path object
which is unique to each test function invocation,
created as a sub directory of the base temporary
directory. The returned object is a `py.path.local`_
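
As a quick illustration of the fixtures documented above, a test can combine ``capsys`` and ``monkeypatch`` roughly like this (a sketch; the test name and environment variable are made up for the example)::

    import sys

    def test_output_and_env(capsys, monkeypatch):
        # capsys captures writes to sys.stdout/sys.stderr during the test
        print("hello")
        sys.stderr.write("oops\n")
        out, err = capsys.readouterr()   # returns a (out, err) tuple
        assert out == "hello\n"
        assert err == "oops\n"

        # monkeypatch modifications are undone after the test or fixture finishes
        monkeypatch.setenv("APP_MODE", "testing")
        monkeypatch.delenv("HOME", raising=False)  # raising=False avoids KeyError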

View File

@ -80,7 +80,7 @@ If you then run it with ``--lf``::
$ pytest --lf
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
run-last-failure: rerun last 2 failures
rootdir: $REGENDOC_TMPDIR, inifile:
collected 50 items
@ -121,7 +121,7 @@ of ``FF`` and dots)::
$ pytest --ff
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
run-last-failure: rerun last 2 failures first
rootdir: $REGENDOC_TMPDIR, inifile:
collected 50 items
@ -226,23 +226,16 @@ You can always peek at the content of the cache using the
$ py.test --cache-show
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 1 items
cachedir: $REGENDOC_TMPDIR/.cache
------------------------------- cache values -------------------------------
cache/lastfailed contains:
{'test_caching.py::test_function': True}
example/value contains:
42
test_caching.py F
======= FAILURES ========
_______ test_function ________
mydata = 42
def test_function(mydata):
> assert mydata == 23
E assert 42 == 23
test_caching.py:14: AssertionError
======= 1 failed in 0.12 seconds ========
======= no tests ran in 0.12 seconds ========
Clearing Cache content
-------------------------------
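
For reference, the ``example/value`` entry and the failing ``test_function`` shown above come from a fixture along these lines (a sketch reconstructed from the surrounding docs, not part of this diff)::

    import pytest

    @pytest.fixture
    def mydata(request):
        # config.cache persists values in .cache/ across test runs
        val = request.config.cache.get("example/value", None)
        if val is None:
            val = 42  # stands in for an expensive computation done only once
            request.config.cache.set("example/value", val)
        return val

    def test_function(mydata):
        assert mydata == 23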

View File

@ -64,7 +64,7 @@ of the failing function and hide the other one::
$ pytest
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 2 items

View File

@ -48,7 +48,7 @@ then you can just invoke ``pytest`` without command line options::
$ pytest
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile: pytest.ini
collected 1 items

View File

@ -31,7 +31,7 @@ You can then restrict a test run to only run tests marked with ``webtest``::
$ pytest -v -m webtest
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
cachedir: .cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 4 items
@ -45,7 +45,7 @@ Or the inverse, running all tests except the webtest ones::
$ pytest -v -m "not webtest"
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
cachedir: .cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 4 items
@ -66,7 +66,7 @@ tests based on their module, class, method, or function name::
$ pytest -v test_server.py::TestClass::test_method
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
cachedir: .cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 5 items
@ -79,7 +79,7 @@ You can also select on the class::
$ pytest -v test_server.py::TestClass
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
cachedir: .cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 4 items
@ -92,7 +92,7 @@ Or select multiple nodes::
$ pytest -v test_server.py::TestClass test_server.py::test_send_http
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
cachedir: .cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 8 items
@ -130,7 +130,7 @@ select tests based on their names::
$ pytest -v -k http # running with the above defined example module
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
cachedir: .cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 4 items
@ -144,7 +144,7 @@ And you can also run all tests except the ones that match the keyword::
$ pytest -k "not send_http" -v
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
cachedir: .cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 4 items
@ -160,7 +160,7 @@ Or to select "http" and "quick" tests::
$ pytest -k "http or quick" -v
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
cachedir: .cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 4 items
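
The selection runs above operate on a small module in the spirit of the following sketch; the function bodies, and therefore the exact item counts in the output, are illustrative rather than taken from this diff::

    # content of test_server.py (illustrative)
    import pytest

    @pytest.mark.webtest
    def test_send_http():
        pass  # perform some webtest test for your app

    def test_something_quick():
        pass

    def test_another():
        pass

    class TestClass(object):
        def test_method(self):
            pass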
@ -352,7 +352,7 @@ the test needs::
$ pytest -E stage2
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 1 items
@ -364,7 +364,7 @@ and here is one that specifies exactly the environment needed::
$ pytest -E stage1
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 1 items
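
The ``-E`` option itself is not builtin; it comes from a conftest hook roughly like the sketch below (using the pytest 3.0-era ``get_marker`` API), with tests marked e.g. ``@pytest.mark.env("stage1")``::

    # content of conftest.py (sketch)
    import pytest

    def pytest_addoption(parser):
        parser.addoption("-E", action="store", metavar="NAME",
            help="only run tests matching the environment NAME.")

    def pytest_configure(config):
        # register the "env" marker
        config.addinivalue_line("markers",
            "env(name): mark test to run only on named environment")

    def pytest_runtest_setup(item):
        envmarker = item.get_marker("env")
        if envmarker is not None:
            envname = envmarker.args[0]
            if envname != item.config.getoption("-E"):
                pytest.skip("test requires env %r" % envname)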
@ -485,7 +485,7 @@ then you will see two test skipped and two executed tests as expected::
$ pytest -rs # this option reports skip reasons
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 4 items
@ -499,7 +499,7 @@ Note that if you specify a platform via the marker-command line option like this
$ pytest -m linux2
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 4 items
@ -551,7 +551,7 @@ We can now use the ``-m option`` to select one set::
$ pytest -m interface --tb=short
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 4 items
@ -573,7 +573,7 @@ or to select both "event" and "interface" tests::
$ pytest -m "interface or event" --tb=short
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 4 items

View File

@ -27,11 +27,11 @@ now execute the test specification::
nonpython $ pytest test_simple.yml
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR/nonpython, inifile:
collected 2 items
test_simple.yml .F
test_simple.yml F.
======= FAILURES ========
_______ usecase: hello ________
@ -59,13 +59,13 @@ consulted when reporting in ``verbose`` mode::
nonpython $ pytest -v
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
cachedir: .cache
rootdir: $REGENDOC_TMPDIR/nonpython, inifile:
collecting ... collected 2 items
test_simple.yml::ok PASSED
test_simple.yml::hello FAILED
test_simple.yml::ok PASSED
======= FAILURES ========
_______ usecase: hello ________
@ -81,11 +81,11 @@ interesting to just look at the collection tree::
nonpython $ pytest --collect-only
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR/nonpython, inifile:
collected 2 items
<YamlFile 'test_simple.yml'>
<YamlItem 'ok'>
<YamlItem 'hello'>
<YamlItem 'ok'>
======= no tests ran in 0.12 seconds ========

View File

@ -10,7 +10,7 @@ class YamlFile(pytest.File):
def collect(self):
import yaml # we need a yaml parser, e.g. PyYAML
raw = yaml.safe_load(self.fspath.open())
for name, spec in raw.items():
for name, spec in sorted(raw.items()):
yield YamlItem(name, self, spec)
class YamlItem(pytest.Item):
@ -19,7 +19,7 @@ class YamlItem(pytest.Item):
self.spec = spec
def runtest(self):
for name, value in self.spec.items():
for name, value in sorted(self.spec.items()):
# some custom test execution (dumb example follows)
if name != value:
raise YamlException(self, name, value)
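
For context, these classes are wired into collection by a ``pytest_collect_file`` hook in the same conftest, roughly as follows (not part of this hunk)::

    import pytest

    def pytest_collect_file(parent, path):
        # YamlFile is the class shown in the diff above
        if path.ext == ".yml" and path.basename.startswith("test"):
            return YamlFile(path, parent)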

View File

@ -130,7 +130,7 @@ objects, they are still using the default pytest representation::
$ pytest test_time.py --collect-only
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 6 items
<Module 'test_time.py'>
@ -181,7 +181,7 @@ this is a fully self-contained example which you can run with::
$ pytest test_scenarios.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 4 items
@ -194,7 +194,7 @@ If you just collect tests you'll also nicely see 'advanced' and 'basic' as varia
$ pytest --collect-only test_scenarios.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 4 items
<Module 'test_scenarios.py'>
@ -259,7 +259,7 @@ Let's first see how it looks like at collection time::
$ pytest test_backends.py --collect-only
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 2 items
<Module 'test_backends.py'>
@ -320,7 +320,7 @@ The result of this test will be successful::
$ pytest test_indirect_list.py --collect-only
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 1 items
<Module 'test_indirect_list.py'>
@ -369,7 +369,7 @@ argument sets to use for each test function. Let's run it::
$ pytest -q
F..
======= FAILURES ========
_______ TestClass.test_equals[1-2] ________
_______ TestClass.test_equals[2-1] ________
self = <test_parametrize.TestClass object at 0xdeadbeef>, a = 1, b = 2
@ -397,8 +397,11 @@ is to be run with different sets of arguments for its three arguments:
Running it results in some skips if we don't have all the python interpreters installed and otherwise runs all combinations (5 interpreters times 5 interpreters times 3 objects to serialize/deserialize)::
. $ pytest -rs -q multipython.py
...........................
27 passed in 0.12 seconds
ssssssssssss...ssssssssssss
======= short test summary info ========
SKIP [12] $REGENDOC_TMPDIR/CWD/multipython.py:23: 'python3.3' not found
SKIP [12] $REGENDOC_TMPDIR/CWD/multipython.py:23: 'python2.6' not found
3 passed, 24 skipped in 0.12 seconds
Indirect parametrization of optional implementations/imports
--------------------------------------------------------------------
@ -445,7 +448,7 @@ If you run this with reporting for skips enabled::
$ pytest -rs test_module.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 2 items

View File

@ -116,7 +116,7 @@ then the test collection looks like this::
$ pytest --collect-only
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile: setup.cfg
collected 2 items
<Module 'check_myapp.py'>
@ -162,7 +162,7 @@ You can always peek at the collection tree without running tests like this::
. $ pytest --collect-only pythoncollection.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile: pytest.ini
collected 3 items
<Module 'CWD/pythoncollection.py'>
@ -229,7 +229,7 @@ will be left out::
$ pytest --collect-only
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile: pytest.ini
collected 0 items

View File

@ -13,7 +13,7 @@ get on the terminal - we are working on that):
assertion $ pytest failure_demo.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR/assertion, inifile:
collected 42 items
@ -361,7 +361,7 @@ get on the terminal - we are working on that):
> int(s)
E ValueError: invalid literal for int() with base 10: 'qwe'
<0-codegen $PYTHON_PREFIX/lib/python3.5/site-packages/_pytest/python.py:1309>:1: ValueError
<0-codegen $PYTHON_PREFIX/lib/python3.5/site-packages/_pytest/python.py:1176>:1: ValueError
_______ TestRaises.test_raises_doesnt ________
self = <failure_demo.TestRaises object at 0xdeadbeef>
@ -427,7 +427,7 @@ get on the terminal - we are working on that):
def foo():
> assert 1 == 0
E assert 1 == 0
E AssertionError
<2-codegen 'abc-123' $REGENDOC_TMPDIR/assertion/failure_demo.py:163>:2: AssertionError
_______ TestMoreErrors.test_complex_error ________
@ -482,8 +482,9 @@ get on the terminal - we are working on that):
s = "123"
g = "456"
> assert s.startswith(g)
E assert <built-in method startswith of str object at 0xdeadbeef>('456')
E + where <built-in method startswith of str object at 0xdeadbeef> = '123'.startswith
E assert False
E + where False = <built-in method startswith of str object at 0xdeadbeef>('456')
E + where <built-in method startswith of str object at 0xdeadbeef> = '123'.startswith
failure_demo.py:189: AssertionError
_______ TestMoreErrors.test_startswith_nested ________
@ -496,10 +497,11 @@ get on the terminal - we are working on that):
def g():
return "456"
> assert f().startswith(g())
E assert <built-in method startswith of str object at 0xdeadbeef>('456')
E + where <built-in method startswith of str object at 0xdeadbeef> = '123'.startswith
E + where '123' = <function TestMoreErrors.test_startswith_nested.<locals>.f at 0xdeadbeef>()
E + and '456' = <function TestMoreErrors.test_startswith_nested.<locals>.g at 0xdeadbeef>()
E assert False
E + where False = <built-in method startswith of str object at 0xdeadbeef>('456')
E + where <built-in method startswith of str object at 0xdeadbeef> = '123'.startswith
E + where '123' = <function TestMoreErrors.test_startswith_nested.<locals>.f at 0xdeadbeef>()
E + and '456' = <function TestMoreErrors.test_startswith_nested.<locals>.g at 0xdeadbeef>()
failure_demo.py:196: AssertionError
_______ TestMoreErrors.test_global_func ________
@ -508,8 +510,9 @@ get on the terminal - we are working on that):
def test_global_func(self):
> assert isinstance(globf(42), float)
E assert isinstance(43, float)
E + where 43 = globf(42)
E assert False
E + where False = isinstance(43, float)
E + where 43 = globf(42)
failure_demo.py:199: AssertionError
_______ TestMoreErrors.test_instance ________

View File

@ -108,7 +108,7 @@ directory with the above conftest.py::
$ pytest
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 0 items
@ -156,7 +156,7 @@ and when running it will see a skipped "slow" test::
$ pytest -rs # "-rs" means report details on the little 's'
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 2 items
@ -170,7 +170,7 @@ Or run it including the ``slow`` marked test::
$ pytest --runslow
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 2 items
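
The skipping behaviour shown above is driven by a conftest along these lines (a sketch: ``--runslow`` adds the option, and runs without it skip tests carrying the ``slow`` keyword)::

    import pytest

    def pytest_addoption(parser):
        parser.addoption("--runslow", action="store_true",
            help="run slow tests")

    def pytest_runtest_setup(item):
        if "slow" in item.keywords and not item.config.getoption("--runslow"):
            pytest.skip("need --runslow option to run")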
@ -284,7 +284,7 @@ which will add the string to the test header accordingly::
$ pytest
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
project deps: mylib-1.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 0 items
@ -308,7 +308,7 @@ which will add info only when run with "--v"::
$ pytest -v
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
cachedir: .cache
info1: did you know that ...
did you?
@ -321,7 +321,7 @@ and nothing when run plainly::
$ pytest
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 0 items
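
The extra header lines come from a ``pytest_report_header`` hook; a sketch of the verbose-only variant looks like this (returning a plain string such as ``"project deps: mylib-1.1"`` instead would print the line regardless of verbosity, as in the first run above)::

    # content of conftest.py (sketch)
    def pytest_report_header(config):
        if config.getoption("verbose") > 0:
            return ["info1: did you know that ...", "did you?"]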
@ -354,7 +354,7 @@ Now we can profile which test functions execute the slowest::
$ pytest --durations=3
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 3 items
@ -363,7 +363,7 @@ Now we can profile which test functions execute the slowest::
======= slowest 3 test durations ========
0.20s call test_some_are_slow.py::test_funcslow2
0.10s call test_some_are_slow.py::test_funcslow1
0.00s teardown test_some_are_slow.py::test_funcslow2
0.00s setup test_some_are_slow.py::test_funcfast
======= 3 passed in 0.12 seconds ========
incremental testing - test steps
@ -416,7 +416,7 @@ If we run this::
$ pytest -rx
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 4 items
@ -487,7 +487,7 @@ We can run this::
$ pytest
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 7 items
@ -500,8 +500,8 @@ We can run this::
_______ ERROR at setup of test_root ________
file $REGENDOC_TMPDIR/b/test_error.py, line 1
def test_root(db): # no db here, will error out
fixture 'db' not found
available fixtures: tmpdir_factory, cache, tmpdir, pytestconfig, recwarn, monkeypatch, capfd, record_xml_property, capsys
E fixture 'db' not found
available fixtures: monkeypatch, capfd, recwarn, pytestconfig, tmpdir_factory, tmpdir, cache, capsys, record_xml_property, doctest_namespace
use 'pytest --fixtures [testpath]' for help on them.
$REGENDOC_TMPDIR/b/test_error.py:1
@ -591,7 +591,7 @@ and run them::
$ pytest test_module.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 2 items
@ -681,7 +681,7 @@ and run it::
$ pytest -s test_module.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 3 items

View File

@ -70,7 +70,7 @@ marked ``smtp`` fixture function. Running the test looks like this::
$ pytest test_smtpsimple.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 1 items
@ -188,7 +188,7 @@ inspect what is going on and can now run the tests::
$ pytest test_module.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 2 items
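
The ``smtp`` fixture behind these runs is module-scoped and, per the surrounding docs, connects to a real server, so actually running this sketch needs network access::

    # content of conftest.py (sketch)
    import smtplib
    import pytest

    @pytest.fixture(scope="module")
    def smtp():
        return smtplib.SMTP("smtp.gmail.com", 587, timeout=5)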
@ -352,8 +352,8 @@ again, nothing much has changed::
$ pytest -s -q --tb=no
FFfinalizing <smtplib.SMTP object at 0xdeadbeef> (smtp.gmail.com)
2 failed in 0.12 seconds
.
2 failed, 1 passed in 0.12 seconds
Let's quickly create another test module that actually sets the
server URL in its module namespace::
@ -516,9 +516,9 @@ Running the above tests results in the following test IDs being used::
$ pytest --collect-only
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 10 items
collected 11 items
<Module 'test_anothersmtp.py'>
<Function 'test_showhelo[smtp.gmail.com]'>
<Function 'test_showhelo[mail.python.org]'>
@ -532,6 +532,8 @@ Running the above tests results in the following test IDs being used::
<Function 'test_noop[smtp.gmail.com]'>
<Function 'test_ehlo[mail.python.org]'>
<Function 'test_noop[mail.python.org]'>
<Module 'test_yield2.py'>
<Function 'test_has_lines'>
======= no tests ran in 0.12 seconds ========
@ -567,7 +569,7 @@ Here we declare an ``app`` fixture which receives the previously defined
$ pytest -v test_appsetup.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
cachedir: .cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 2 items
@ -636,7 +638,7 @@ Let's run the tests in verbose mode and with looking at the print-output::
$ pytest -v -s test_module.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.5
cachedir: .cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 8 items

View File

@ -27,7 +27,7 @@ Installation options::
To check your installation has installed the correct version::
$ pytest --version
This is pytest version 2.9.2, imported from $PYTHON_PREFIX/lib/python3.5/site-packages/pytest.py
This is pytest version 3.0.0, imported from $PYTHON_PREFIX/lib/python3.5/site-packages/pytest.py
If you get an error checkout :ref:`installation issues`.
@ -49,7 +49,7 @@ That's it. You can execute the test function now::
$ pytest
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 1 items
@ -137,7 +137,8 @@ run the module by passing its filename::
def test_two(self):
x = "hello"
> assert hasattr(x, 'check')
E assert hasattr('hello', 'check')
E assert False
E + where False = hasattr('hello', 'check')
test_class.py:8: AssertionError
1 failed, 1 passed in 0.12 seconds
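
The failing ``test_two`` above corresponds to a small class like this (reconstructed from the traceback; not part of the hunk)::

    # content of test_class.py
    class TestClass(object):
        def test_one(self):
            x = "this"
            assert "h" in x

        def test_two(self):
            x = "hello"
            assert hasattr(x, "check")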

View File

@ -55,7 +55,7 @@ them in turn::
$ pytest
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 3 items
@ -103,7 +103,7 @@ Let's run this::
$ pytest
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 3 items
@ -186,8 +186,9 @@ Let's also run with a stringinput that will lead to a failing test::
def test_valid_string(stringinput):
> assert stringinput.isalpha()
E assert <built-in method isalpha of str object at 0xdeadbeef>()
E + where <built-in method isalpha of str object at 0xdeadbeef> = '!'.isalpha
E assert False
E + where False = <built-in method isalpha of str object at 0xdeadbeef>()
E + where <built-in method isalpha of str object at 0xdeadbeef> = '!'.isalpha
test_strings.py:3: AssertionError
1 failed in 0.12 seconds
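
The ``stringinput`` argument above is generated from the command line; a sketch of the conftest machinery follows (running e.g. ``pytest -q --stringinput="!" test_strings.py`` produces the failure shown)::

    # content of conftest.py (sketch)
    def pytest_addoption(parser):
        parser.addoption("--stringinput", action="append", default=[],
            help="list of stringinputs to pass to test functions")

    def pytest_generate_tests(metafunc):
        if "stringinput" in metafunc.fixturenames:
            metafunc.parametrize("stringinput",
                                 metafunc.config.option.stringinput)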

View File

@ -224,7 +224,7 @@ Running it with the report-on-xfail option gives this output::
example $ pytest -rx xfail_demo.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR/example, inifile:
collected 7 items

View File

@ -29,7 +29,7 @@ Running this would result in a passed test except for the last
$ pytest test_tmpdir.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 1 items
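
The run above exercises a ``tmpdir`` test along these lines; the trailing ``assert 0`` is deliberate, so the temporary path shows up in the failure output::

    # content of test_tmpdir.py (sketch)
    def test_create_file(tmpdir):
        p = tmpdir.mkdir("sub").join("hello.txt")
        p.write("content")
        assert p.read() == "content"
        assert len(tmpdir.listdir()) == 1
        assert 0  # fail on purpose to see the tmpdir path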

View File

@ -88,7 +88,7 @@ the ``self.db`` values in the traceback::
$ pytest test_unittest_db.py
======= test session starts ========
platform linux -- Python 3.5.1, pytest-2.9.2, py-1.4.31, pluggy-0.3.1
platform linux -- Python 3.5.2, pytest-3.0.0, py-1.4.31, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 2 items

View File

@ -318,7 +318,7 @@ You can specify additional plugins to ``pytest.main``::
def pytest_sessionfinish(self):
print("*** test run reporting finishing")
pytest.main("-qq", plugins=[MyPlugin()])
pytest.main(["-qq"], plugins=[MyPlugin()])
Running it will show that ``MyPlugin`` was added and its
hook was invoked::
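
A self-contained version of that snippet under pytest 3.0, where the arguments must now be passed as a list, might look like this::

    import pytest

    class MyPlugin(object):
        def pytest_sessionfinish(self):
            print("*** test run reporting finishing")

    if __name__ == "__main__":
        pytest.main(["-qq"], plugins=[MyPlugin()])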