Merge pull request #1250 from nicoddemus/merge-master-into-features

Merge master into features

commit 926c6028bb

AUTHORS | 3
@@ -35,6 +35,7 @@ Eric Hunsberger
Eric Siegerman
Florian Bruhin
Floris Bruynooghe
Gabriel Reis
Graham Horler
Grig Gheorghiu
Guido Wesdorp
@@ -47,6 +48,7 @@ Jason R. Coombs
Jurko Gospodnetić
Katarzyna Jachim
Kevin Cox
Lee Kamentsky
Maciek Fijalkowski
Maho
Marc Schlaich
@@ -54,6 +56,7 @@ Mark Abramowitz
Markus Unterwaditzer
Martijn Faassen
Michael Aquilina
Michael Birtwell
Michael Droettboom
Nicolas Delaby
Pieter Mulder

CHANGELOG | 84
@@ -14,9 +14,89 @@

* New `-rp` and `-rP` reporting options give the summary and full output
of passing tests, respectively. Thanks to David Vierra for the PR.

2.8.2.dev
---------

2.8.5.dev0
----------

- fix #1074: precompute junitxml chunks instead of storing the whole tree in objects
Thanks Bruno Oliveira for the report and Ronny Pfannschmidt for the PR

- fix #1238: fix ``pytest.deprecated_call()`` receiving multiple arguments
(Regression introduced in 2.8.4). Thanks Alex Gaynor for the report and
Bruno Oliveira for the PR.

2.8.4
-----

- fix #1190: ``deprecated_call()`` now works when the deprecated
function has been already called by another test in the same
module. Thanks Mikhail Chernykh for the report and Bruno Oliveira for the
PR.

- fix #1198: ``--pastebin`` option now works on Python 3. Thanks
Mehdy Khoshnoody for the PR.

- fix #1219: ``--pastebin`` now works correctly when captured output contains
non-ascii characters. Thanks Bruno Oliveira for the PR.

- fix #1204: another error when collecting with a nasty __getattr__().
Thanks Florian Bruhin for the PR.

- fix the summary printed when no tests did run.
Thanks Florian Bruhin for the PR.

- fix #1185 - ensure MANIFEST.in exactly matches what should go to a sdist

- a number of documentation modernizations wrt good practices.
Thanks Bruno Oliveira for the PR.

2.8.3
-----

- fix #1169: add __name__ attribute to testcases in TestCaseFunction to
support the @unittest.skip decorator on functions and methods.
Thanks Lee Kamentsky for the PR.

- fix #1035: collecting tests if test module level obj has __getattr__().
Thanks Suor for the report and Bruno Oliveira / Tom Viner for the PR.

- fix #331: don't collect tests if their failure cannot be reported correctly
e.g. they are a callable instance of a class.

- fix #1133: fixed internal error when filtering tracebacks where one entry
belongs to a file which is no longer available.
Thanks Bruno Oliveira for the PR.

- enhancement made to highlight in red the name of the failing tests so
they stand out in the output.
Thanks Gabriel Reis for the PR.

- add more talks to the documentation
- extend documentation on the --ignore cli option
- use pytest-runner for setuptools integration
- minor fixes for interaction with OS X El Capitan
system integrity protection (thanks Florian)

2.8.2
-----

- fix #1085: proper handling of encoding errors when passing encoded byte
strings to pytest.parametrize in Python 2.
Thanks Themanwithoutaplan for the report and Bruno Oliveira for the PR.

- fix #1087: handling SystemError when passing empty byte strings to
pytest.parametrize in Python 3.
Thanks Paul Kehrer for the report and Bruno Oliveira for the PR.

- fix #995: fixed internal error when filtering tracebacks where one entry
was generated by an exec() statement.
Thanks Daniel Hahler, Ashley C Straw, Philippe Gauthier and Pavel Savchenko
for contributing and Bruno Oliveira for the PR.

- fix #1100 and #1057: errors when using autouse fixtures and doctest modules.
Thanks Sergey B Kirpichev and Vital Kudzelka for contributing and Bruno
Oliveira for the PR.

2.8.1
-----

@@ -118,7 +118,7 @@ pytest could always use more documentation. What exactly is needed?

* More complementary documentation. Have you perhaps found something unclear?
* Documentation translations. We currently have only English.
* Docstrings. There's never too much of them.
* Docstrings. There can never be too many of them.
* Blog posts, articles and such -- they're all very appreciated.

You can also edit documentation files directly in the Github web interface

@@ -8,7 +8,9 @@ Note: this assumes you have already registered on pypi.

2. Check and finalize CHANGELOG

3. Write doc/en/announce/release-VERSION.txt and include
it in doc/en/announce/index.txt
it in doc/en/announce/index.txt::

git log 2.8.2..HEAD --format='%aN' | sort -u # lists the names of authors involved

4. Use devpi for uploading a release tarball to a staging area::

@@ -40,6 +42,7 @@ Note: this assumes you have already registered on pypi.

installed::

cd doc/en
python plugins_index/plugins_index.py
make html

Commit any changes before tagging the release.

@@ -54,25 +57,25 @@ Note: this assumes you have already registered on pypi.

cd doc/en
make install # or "installall" if you have LaTeX installed for PDF

This requires ssh-login permission on pytest.org because it uses
rsync.
Note that the ``install`` target of ``doc/en/Makefile`` defines where the
rsync goes to, typically to the "latest" section of pytest.org.

If you are making a minor release (e.g. 5.4), you also need to manually
create a symlink for "latest"::

ssh pytest-dev@pytest.org
ln -s 5.4 latest

Browse to pytest.org to verify.

11. Publish to pypi::

devpi push pytest-VERSION pypi:NAME

where NAME is the name of pypi.python.org as configured in your ``~/.pypirc``
file `for devpi <http://doc.devpi.net/latest/quickstart-releaseprocess.html?highlight=pypirc#devpi-push-releasing-to-an-external-index>`_.

12. Send release announcement to mailing lists:

@@ -83,5 +86,7 @@ Note: this assumes you have already registered on pypi.

13. **after the release** Bump the version number in ``_pytest/__init__.py``,
to the next Minor release version (i.e. if you released ``pytest-2.8.0``,
set it to ``pytest-2.9.0.dev1``).

14. merge the actual release into the features branch and do a pull request against it

@@ -115,7 +115,7 @@ tags: feature

- introduce pytest.mark.nocollect for not considering a function for
test collection at all. maybe also introduce a pytest.mark.test to
explicitely mark a function to become a tested one. Lookup JUnit ways
explicitly mark a function to become a tested one. Lookup JUnit ways
of tagging tests.

introduce pytest.mark.importorskip

MANIFEST.in | 35
@@ -1,7 +1,34 @@

include CHANGELOG
include README.rst
include setup.py
include tox.ini
include LICENSE
graft doc
include AUTHORS

include README.rst
include CONTRIBUTING.rst

include tox.ini
include setup.py

include .coveragerc

include plugin-test.sh
include requirements-docs.txt
include runtox.py

recursive-include bench *.py
recursive-include extra *.py

graft testing
graft doc

exclude _pytest/impl

graft _pytest/vendored_packages

recursive-exclude * *.pyc *.pyo

exclude appveyor/install.ps1
exclude appveyor.yml
exclude appveyor

exclude ISSUES.txt
exclude HOWTORELEASE.rst

@@ -5,9 +5,9 @@ pytest

The ``pytest`` testing tool makes it easy to write small tests, yet
scales to support complex functional testing.

.. image:: http://img.shields.io/pypi/v/pytest.svg
.. image:: https://img.shields.io/pypi/v/pytest.svg
:target: https://pypi.python.org/pypi/pytest
.. image:: http://img.shields.io/coveralls/pytest-dev/pytest/master.svg
.. image:: https://img.shields.io/coveralls/pytest-dev/pytest/master.svg
:target: https://coveralls.io/r/pytest-dev/pytest
.. image:: https://travis-ci.org/pytest-dev/pytest.svg?branch=master
:target: https://travis-ci.org/pytest-dev/pytest

@@ -128,7 +128,7 @@ class AssertionRewritingHook(object):

# One of the path components was not a directory, likely
# because we're in a zip file.
write = False
elif e in [errno.EACCES, errno.EROFS]:
elif e in [errno.EACCES, errno.EROFS, errno.EPERM]:
state.trace("read only directory: %r" % fn_pypath.dirname)
write = False
else:
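
The hunk above treats ``EPERM`` like ``EACCES``/``EROFS`` when the rewritten
module cache cannot be written. As an illustrative aside (not part of this
commit; the helper name is made up), the same "treat permission errors as a
read-only destination" pattern looks like this::

    import errno

    def try_write_cache(path, data):
        # Returning False means "skip writing a cache file here",
        # mirroring the EACCES/EROFS/EPERM branch above.
        try:
            with open(path, "wb") as fp:
                fp.write(data)
            return True
        except (IOError, OSError) as e:
            if e.errno in (errno.EACCES, errno.EROFS, errno.EPERM):
                return False
            raise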

@@ -192,8 +192,8 @@ def cache(request):

cache.get(key, default)
cache.set(key, value)

Keys must be strings not containing a "/" separator. Add a unique identifier
(such as plugin/app name) to avoid clashes with other cache users.
Keys must be a ``/`` separated value, where the first part is usually the
name of your plugin or application to avoid clashes with other cache users.

Values can be any object handled by the json stdlib module.
"""
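
As an illustrative aside (not part of this commit; the plugin name and key are
made up), a plugin would use the documented API above with a ``/``-separated key::

    def pytest_configure(config):
        # the first path component identifies the plugin and avoids key clashes
        runs = config.cache.get("myplugin/run_count", 0)
        config.cache.set("myplugin/run_count", runs + 1)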

@@ -5,6 +5,7 @@ import pytest, py

from _pytest.python import FixtureRequest
from py._code.code import TerminalRepr, ReprFileLocation

def pytest_addoption(parser):
parser.addini('doctest_optionflags', 'option flags for doctests',
type="args", default=["ELLIPSIS"])

@@ -22,6 +23,7 @@ def pytest_addoption(parser):

help="ignore doctest ImportErrors",
dest="doctest_ignore_import_errors")

def pytest_collect_file(path, parent):
config = parent.config
if path.ext == ".py":

@@ -31,20 +33,33 @@ def pytest_collect_file(path, parent):

path.check(fnmatch=config.getvalue("doctestglob")):
return DoctestTextfile(path, parent)

class ReprFailDoctest(TerminalRepr):

def __init__(self, reprlocation, lines):
self.reprlocation = reprlocation
self.lines = lines

def toterminal(self, tw):
for line in self.lines:
tw.line(line)
self.reprlocation.toterminal(tw)

class DoctestItem(pytest.Item):

def __init__(self, name, parent, runner=None, dtest=None):
super(DoctestItem, self).__init__(name, parent)
self.runner = runner
self.dtest = dtest
self.obj = None
self.fixture_request = None

def setup(self):
if self.dtest is not None:
self.fixture_request = _setup_fixtures(self)
globs = dict(getfixture=self.fixture_request.getfuncargvalue)
self.dtest.globs.update(globs)

def runtest(self):
_check_all_skipped(self.dtest)

@@ -94,6 +109,7 @@ class DoctestItem(pytest.Item):

def reportinfo(self):
return self.fspath, None, "[doctest] %s" % self.name

def _get_flag_lookup():
import doctest
return dict(DONT_ACCEPT_TRUE_FOR_1=doctest.DONT_ACCEPT_TRUE_FOR_1,

@@ -104,6 +120,7 @@ def _get_flag_lookup():

COMPARISON_FLAGS=doctest.COMPARISON_FLAGS,
ALLOW_UNICODE=_get_allow_unicode_flag())

def get_optionflags(parent):
optionflags_str = parent.config.getini("doctest_optionflags")
flag_lookup_table = _get_flag_lookup()

@@ -113,7 +130,7 @@ def get_optionflags(parent):

return flag_acc

class DoctestTextfile(DoctestItem, pytest.File):
class DoctestTextfile(DoctestItem, pytest.Module):

def runtest(self):
import doctest

@@ -148,7 +165,7 @@ def _check_all_skipped(test):

pytest.skip('all tests skipped by +SKIP option')

class DoctestModule(pytest.File):
class DoctestModule(pytest.Module):
def collect(self):
import doctest
if self.fspath.basename == "conftest.py":

@@ -161,23 +178,19 @@ class DoctestModule(pytest.File):

pytest.skip('unable to import module %r' % self.fspath)
else:
raise
# satisfy `FixtureRequest` constructor...
fixture_request = _setup_fixtures(self)
doctest_globals = dict(getfixture=fixture_request.getfuncargvalue)
# uses internal doctest module parsing mechanism
finder = doctest.DocTestFinder()
optionflags = get_optionflags(self)
runner = doctest.DebugRunner(verbose=0, optionflags=optionflags,
checker=_get_unicode_checker())
for test in finder.find(module, module.__name__,
extraglobs=doctest_globals):
for test in finder.find(module, module.__name__):
if test.examples:  # skip empty doctests
yield DoctestItem(test.name, self, runner, test)

def _setup_fixtures(doctest_item):
"""
Used by DoctestTextfile and DoctestModule to setup fixture information.
Used by DoctestTextfile and DoctestItem to setup fixture information.
"""
def func():
pass
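
As an illustrative aside (not part of this commit), the ``getfixture`` name that
``DoctestItem.setup`` injects into the doctest globals is what lets a doctest pull
in pytest fixtures, for example the builtin ``tmpdir`` fixture::

    >>> tmp = getfixture('tmpdir')        # name provided by the change above
    >>> tmp.join('hello.txt').write('hi')
    >>> tmp.join('hello.txt').read()
    'hi'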

@@ -80,13 +80,23 @@ def showhelp(config):

line = " %-24s %s" %(spec, help)
tw.line(line[:tw.fullwidth])

tw.line() ; tw.line()
#tw.sep("=")
tw.line()
tw.line("environment variables:")
vars = [
("PYTEST_ADDOPTS", "extra command line options"),
("PYTEST_PLUGINS", "comma-separated plugins to load during startup"),
("PYTEST_DEBUG", "set to enable debug tracing of pytest's internals")
]
for name, help in vars:
tw.line(" %-24s %s" % (name, help))
tw.line()
tw.line()

tw.line("to see available markers type: py.test --markers")
tw.line("to see available fixtures type: py.test --fixtures")
tw.line("(shown according to specified file_or_dir or current dir "
"if not specified)")
tw.line(str(reporter.stats))

for warningreport in reporter.stats.get('warnings', []):
tw.line("warning : " + warningreport.message, red=True)
return
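
As an illustrative aside (not part of this commit), the environment variables now
listed by ``--help`` are read by pytest at startup; for instance ``PYTEST_ADDOPTS``
injects extra command line options into an invocation::

    import os
    import subprocess

    # run py.test as if "-q --maxfail=1" had been passed on the command line
    env = dict(os.environ, PYTEST_ADDOPTS="-q --maxfail=1")
    subprocess.call(["py.test"], env=env)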

@@ -1,9 +1,13 @@

""" report test results in JUnit-XML format, for use with Hudson and build integration servers.
"""
report test results in JUnit-XML format,
for use with Jenkins and build integration servers.

Output conforms to https://github.com/jenkinsci/xunit-plugin/blob/master/src/main/resources/org/jenkinsci/plugins/xunit/types/model/xsd/junit-10.xsd

Based on initial code from Ross Lawley.
"""
# Output conforms to https://github.com/jenkinsci/xunit-plugin/blob/master/
# src/main/resources/org/jenkinsci/plugins/xunit/types/model/xsd/junit-10.xsd

import py
import os
import re

@@ -19,10 +23,10 @@ else:

unicode = str
long = int

class Junit(py.xml.Namespace):
pass

# We need to get the subset of the invalid unicode ranges according to
# XML 1.0 which are valid in this python build. Hence we calculate
# this dynamically instead of hardcoding it. The spec range of valid

@@ -30,21 +34,19 @@ class Junit(py.xml.Namespace):

# | [#x10000-#x10FFFF]
_legal_chars = (0x09, 0x0A, 0x0d)
_legal_ranges = (
(0x20, 0x7E),
(0x80, 0xD7FF),
(0xE000, 0xFFFD),
(0x10000, 0x10FFFF),
(0x20, 0x7E), (0x80, 0xD7FF), (0xE000, 0xFFFD), (0x10000, 0x10FFFF),
)
_legal_xml_re = [unicode("%s-%s") % (unichr(low), unichr(high))
for (low, high) in _legal_ranges
if low < sys.maxunicode]
_legal_xml_re = [
unicode("%s-%s") % (unichr(low), unichr(high))
for (low, high) in _legal_ranges if low < sys.maxunicode
]
_legal_xml_re = [unichr(x) for x in _legal_chars] + _legal_xml_re
illegal_xml_re = re.compile(unicode('[^%s]') %
unicode('').join(_legal_xml_re))
illegal_xml_re = re.compile(unicode('[^%s]') % unicode('').join(_legal_xml_re))
del _legal_chars
del _legal_ranges
del _legal_xml_re

def bin_xml_escape(arg):
def repl(matchobj):
i = ord(matchobj.group())

@@ -52,122 +54,93 @@ def bin_xml_escape(arg):

return unicode('#x%02X') % i
else:
return unicode('#x%04X') % i

return py.xml.raw(illegal_xml_re.sub(repl, py.xml.escape(arg)))

@pytest.fixture
def record_xml_property(request):
"""Fixture that adds extra xml properties to the tag for the calling test.
The fixture is callable with (name, value), with value being automatically
xml-encoded.
"""
def inner(name, value):
if hasattr(request.config, "_xml"):
request.config._xml.add_custom_property(name, value)
msg = 'record_xml_property is an experimental feature'
request.config.warn(code='C3', message=msg,
fslocation=request.node.location[:2])
return inner

def pytest_addoption(parser):
group = parser.getgroup("terminal reporting")
group.addoption('--junitxml', '--junit-xml', action="store",
dest="xmlpath", metavar="path", default=None,
help="create junit-xml style report file at given path.")
group.addoption('--junitprefix', '--junit-prefix', action="store",
metavar="str", default=None,
help="prepend prefix to classnames in junit-xml output")

class _NodeReporter(object):
def __init__(self, nodeid, xml):

def pytest_configure(config):
xmlpath = config.option.xmlpath
# prevent opening xmllog on slave nodes (xdist)
if xmlpath and not hasattr(config, 'slaveinput'):
config._xml = LogXML(xmlpath, config.option.junitprefix)
config.pluginmanager.register(config._xml)

self.id = nodeid
self.xml = xml
self.add_stats = self.xml.add_stats
self.duration = 0
self.properties = {}
self.property_insert_order = []
self.nodes = []
self.testcase = None
self.attrs = {}

def pytest_unconfigure(config):
xml = getattr(config, '_xml', None)
if xml:
del config._xml
config.pluginmanager.unregister(xml)

def mangle_testnames(names):
names = [x.replace(".py", "") for x in names if x != '()']
names[0] = names[0].replace("/", '.')
return names

def append(self, node):
self.xml.add_stats(type(node).__name__)
self.nodes.append(node)

class LogXML(object):
def __init__(self, logfile, prefix):
logfile = os.path.expanduser(os.path.expandvars(logfile))
self.logfile = os.path.normpath(os.path.abspath(logfile))
self.prefix = prefix
self.tests = []
self.tests_by_nodeid = {}  # nodeid -> Junit.testcase
self.durations = {}  # nodeid -> total duration (setup+call+teardown)
self.passed = self.skipped = 0
self.failed = self.errors = 0
self.custom_properties = {}

def add_property(self, name, value):
name = str(name)
if name not in self.property_insert_order:
self.property_insert_order.append(name)
self.properties[name] = bin_xml_escape(value)

def add_custom_property(self, name, value):
self.custom_properties[str(name)] = bin_xml_escape(str(value))

def _opentestcase(self, report):
names = mangle_testnames(report.nodeid.split("::"))

def make_properties_node(self):
"""Return a Junit node containing custom properties, if any.
"""
if self.properties:
return Junit.properties([
Junit.property(name=name, value=self.properties[name])
for name in self.property_insert_order
])
return ''

def record_testreport(self, testreport):
assert not self.testcase
names = mangle_testnames(testreport.nodeid.split("::"))
classnames = names[:-1]
if self.prefix:
classnames.insert(0, self.prefix)
if self.xml.prefix:
classnames.insert(0, self.xml.prefix)
attrs = {
"classname": ".".join(classnames),
"name": bin_xml_escape(names[-1]),
"file": report.location[0],
"time": self.durations.get(report.nodeid, 0),
"file": testreport.location[0],
}
if report.location[1] is not None:
attrs["line"] = report.location[1]
testcase = Junit.testcase(**attrs)
custom_properties = self.pop_custom_properties()
if custom_properties:
testcase.append(custom_properties)
self.tests.append(testcase)
self.tests_by_nodeid[report.nodeid] = testcase
if testreport.location[1] is not None:
attrs["line"] = testreport.location[1]
self.attrs = attrs

def to_xml(self):
testcase = Junit.testcase(time=self.duration, **self.attrs)
testcase.append(self.make_properties_node())
for node in self.nodes:
testcase.append(node)
return testcase

def _add_simple(self, kind, message, data=None):
data = bin_xml_escape(data)
node = kind(data, message=message)
self.append(node)

def _write_captured_output(self, report):
for capname in ('out', 'err'):
allcontent = ""
for name, content in report.get_sections("Captured std%s" %
capname):
capname):
allcontent += content
if allcontent:
tag = getattr(Junit, 'system-'+capname)
tag = getattr(Junit, 'system-' + capname)
self.append(tag(bin_xml_escape(allcontent)))

def append(self, obj):
self.tests[-1].append(obj)

def pop_custom_properties(self):
"""Return a Junit node containing custom properties set for
the current test, if any, and reset the current custom properties.
"""
if self.custom_properties:
result = Junit.properties(
[
Junit.property(name=name, value=value)
for name, value in self.custom_properties.items()
]
)
self.custom_properties.clear()
return result
return None

def append_pass(self, report):
self.passed += 1
self.add_stats('passed')
self._write_captured_output(report)

def append_failure(self, report):
#msg = str(report.longrepr.reprtraceback.extraline)
# msg = str(report.longrepr.reprtraceback.extraline)
if hasattr(report, "wasxfail"):
self.append(
Junit.skipped(message="xfail-marked test passes unexpectedly"))
self.skipped += 1
self._add_simple(
Junit.skipped,
"xfail-marked test passes unexpectedly")
else:
if hasattr(report.longrepr, "reprcrash"):
message = report.longrepr.reprcrash.message

@@ -179,30 +152,26 @@ class LogXML(object):

fail = Junit.failure(message=message)
fail.append(bin_xml_escape(report.longrepr))
self.append(fail)
self.failed += 1
self._write_captured_output(report)

def append_collect_error(self, report):
#msg = str(report.longrepr.reprtraceback.extraline)
# msg = str(report.longrepr.reprtraceback.extraline)
self.append(Junit.error(bin_xml_escape(report.longrepr),
message="collection failure"))
self.errors += 1

def append_collect_skipped(self, report):
#msg = str(report.longrepr.reprtraceback.extraline)
self.append(Junit.skipped(bin_xml_escape(report.longrepr),
message="collection skipped"))
self.skipped += 1
self._add_simple(
Junit.skipped, "collection skipped", report.longrepr)

def append_error(self, report):
self.append(Junit.error(bin_xml_escape(report.longrepr),
message="test setup failure"))
self.errors += 1
self._add_simple(
Junit.error, "test setup failure", report.longrepr)

def append_skipped(self, report):
if hasattr(report, "wasxfail"):
self.append(Junit.skipped(bin_xml_escape(report.wasxfail),
message="expected test failure"))
self._add_simple(
Junit.skipped, "expected test failure", report.wasxfail
)
else:
filename, lineno, skipreason = report.longrepr
if skipreason.startswith("Skipped: "):

@@ -210,11 +179,113 @@ class LogXML(object):

self.append(
Junit.skipped("%s:%s: %s" % (filename, lineno, skipreason),
type="pytest.skip",
message=skipreason
))
self.skipped += 1
message=skipreason))
self._write_captured_output(report)

def finalize(self):
data = self.to_xml().unicode(indent=0)
self.__dict__.clear()
self.to_xml = lambda: py.xml.raw(data)

@pytest.fixture
def record_xml_property(request):
"""Fixture that adds extra xml properties to the tag for the calling test.
The fixture is callable with (name, value), with value being automatically
xml-encoded.
"""
request.node.warn(
code='C3',
message='record_xml_property is an experimental feature',
)
xml = getattr(request.config, "_xml", None)
if xml is not None:
node_reporter = xml.node_reporter(request.node.nodeid)
return node_reporter.add_property
else:
def add_property_noop(name, value):
pass

return add_property_noop
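
As an illustrative aside (not part of this commit), a test would use the
experimental fixture defined above like this; the recorded ``<property>`` ends
up inside that test's ``<testcase>`` element of the junit-xml report::

    def test_record_property_example(record_xml_property):
        record_xml_property("example_key", 1)
        assert 1 == 1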
def pytest_addoption(parser):
group = parser.getgroup("terminal reporting")
group.addoption(
'--junitxml', '--junit-xml',
action="store",
dest="xmlpath",
metavar="path",
default=None,
help="create junit-xml style report file at given path.")
group.addoption(
'--junitprefix', '--junit-prefix',
action="store",
metavar="str",
default=None,
help="prepend prefix to classnames in junit-xml output")

def pytest_configure(config):
xmlpath = config.option.xmlpath
# prevent opening xmllog on slave nodes (xdist)
if xmlpath and not hasattr(config, 'slaveinput'):
config._xml = LogXML(xmlpath, config.option.junitprefix)
config.pluginmanager.register(config._xml)

def pytest_unconfigure(config):
xml = getattr(config, '_xml', None)
if xml:
del config._xml
config.pluginmanager.unregister(xml)

def mangle_testnames(names):
names = [x.replace(".py", "") for x in names if x != '()']
names[0] = names[0].replace("/", '.')
return names

class LogXML(object):
def __init__(self, logfile, prefix):
logfile = os.path.expanduser(os.path.expandvars(logfile))
self.logfile = os.path.normpath(os.path.abspath(logfile))
self.prefix = prefix
self.stats = dict.fromkeys([
'error',
'passed',
'failure',
'skipped',
], 0)
self.node_reporters = {}  # nodeid -> _NodeReporter
self.node_reporters_ordered = []

def node_reporter(self, report):
nodeid = getattr(report, 'nodeid', report)
# local hack to handle xdist report order
slavenode = getattr(report, 'node', None)

key = nodeid, slavenode

if key in self.node_reporters:
#TODO: breasks for --dist=each
return self.node_reporters[key]
reporter = _NodeReporter(nodeid, self)
self.node_reporters[key] = reporter
self.node_reporters_ordered.append(reporter)
return reporter

def add_stats(self, key):
if key in self.stats:
self.stats[key] += 1

def _opentestcase(self, report):
reporter = self.node_reporter(report)
reporter.record_testreport(report)
return reporter

def pytest_runtest_logreport(self, report):
"""handle a setup/call/teardown report, generating the appropriate
xml tags as necessary.

@@ -240,47 +311,40 @@ class LogXML(object):

"""
if report.passed:
if report.when == "call":  # ignore setup/teardown
self._opentestcase(report)
self.append_pass(report)
reporter = self._opentestcase(report)
reporter.append_pass(report)
elif report.failed:
self._opentestcase(report)
if report.when != "call":
self.append_error(report)
reporter = self._opentestcase(report)
if report.when == "call":
reporter.append_failure(report)
else:
self.append_failure(report)
reporter.append_error(report)
elif report.skipped:
self._opentestcase(report)
self.append_skipped(report)
reporter = self._opentestcase(report)
reporter.append_skipped(report)
self.update_testcase_duration(report)
if report.when == "teardown":
self.node_reporter(report).finalize()

def update_testcase_duration(self, report):
"""accumulates total duration for nodeid from given report and updates
the Junit.testcase with the new total if already created.
"""
total = self.durations.get(report.nodeid, 0.0)
total += getattr(report, 'duration', 0.0)
self.durations[report.nodeid] = total

testcase = self.tests_by_nodeid.get(report.nodeid)
if testcase is not None:
testcase.attr.time = total
reporter = self.node_reporter(report)
reporter.duration += getattr(report, 'duration', 0.0)

def pytest_collectreport(self, report):
if not report.passed:
self._opentestcase(report)
reporter = self._opentestcase(report)
if report.failed:
self.append_collect_error(report)
reporter.append_collect_error(report)
else:
self.append_collect_skipped(report)
reporter.append_collect_skipped(report)

def pytest_internalerror(self, excrepr):
self.errors += 1
data = bin_xml_escape(excrepr)
self.tests.append(
Junit.testcase(
Junit.error(data, message="internal error"),
classname="pytest",
name="internal"))
reporter = self.node_reporter('internal')
reporter.attrs.update(classname="pytest", name='internal')
reporter._add_simple(Junit.error, 'internal error', excrepr)

def pytest_sessionstart(self):
self.suite_start_time = time.time()

@@ -292,19 +356,20 @@ class LogXML(object):

logfile = open(self.logfile, 'w', encoding='utf-8')
suite_stop_time = time.time()
suite_time_delta = suite_stop_time - self.suite_start_time
numtests = self.passed + self.failed

numtests = self.stats['passed'] + self.stats['failure']

logfile.write('<?xml version="1.0" encoding="utf-8"?>')
logfile.write(Junit.testsuite(
self.tests,
[x.to_xml() for x in self.node_reporters_ordered],
name="pytest",
errors=self.errors,
failures=self.failed,
skips=self.skipped,
errors=self.stats['error'],
failures=self.stats['failure'],
skips=self.stats['skipped'],
tests=numtests,
time="%.3f" % suite_time_delta,
).unicode(indent=0))
time="%.3f" % suite_time_delta, ).unicode(indent=0))
logfile.close()

def pytest_terminal_summary(self, terminalreporter):
terminalreporter.write_sep("-", "generated xml file: %s" % (self.logfile))
terminalreporter.write_sep("-",
"generated xml file: %s" % (self.logfile))

@@ -240,17 +240,11 @@ class Node(object):

# used for storing artificial fixturedefs for direct parametrization
self._name2pseudofixturedef = {}

#self.extrainit()

@property
def ihook(self):
""" fspath sensitive hook proxy used to call pytest hooks"""
return self.session.gethookproxy(self.fspath)

#def extrainit(self):
# """"extra initialization after Node is initialized. Implemented
# by some subclasses. """

Module = compatproperty("Module")
Class = compatproperty("Class")
Instance = compatproperty("Instance")

@@ -13,17 +13,21 @@ def pytest_addoption(parser):

@pytest.hookimpl(trylast=True)
def pytest_configure(config):
import py
if config.option.pastebin == "all":
tr = config.pluginmanager.getplugin('terminalreporter')
# if no terminal reporter plugin is present, nothing we can do here;
# this can happen when this function executes in a slave node
# when using pytest-xdist, for example
if tr is not None:
config._pastebinfile = tempfile.TemporaryFile('w+')
# pastebin file will be utf-8 encoded binary file
config._pastebinfile = tempfile.TemporaryFile('w+b')
oldwrite = tr._tw.write
def tee_write(s, **kwargs):
oldwrite(s, **kwargs)
config._pastebinfile.write(str(s))
if py.builtin._istext(s):
s = s.encode('utf-8')
config._pastebinfile.write(s)
tr._tw.write = tee_write

def pytest_unconfigure(config):

@@ -45,7 +49,7 @@ def create_new_paste(contents):

"""
Creates a new paste using bpaste.net service.

:contents: paste contents
:contents: paste contents as utf-8 encoded bytes
:returns: url to the pasted contents
"""
import re

@@ -61,8 +65,8 @@ def create_new_paste(contents):

'expiry': '1week',
}
url = 'https://bpaste.net'
response = urlopen(url, data=urlencode(params)).read()
m = re.search(r'href="/raw/(\w+)"', response)
response = urlopen(url, data=urlencode(params).encode('ascii')).read()
m = re.search(r'href="/raw/(\w+)"', response.decode('utf-8'))
if m:
return '%s/show/%s' % (url, m.group(1))
else:

@@ -4,6 +4,7 @@ import fnmatch

import functools
import py
import inspect
import types
import sys
import pytest
from _pytest.mark import MarkDecorator, MarkerError

@@ -43,13 +44,31 @@ else:

def _format_args(func):
return inspect.formatargspec(*inspect.getargspec(func))

if sys.version_info[:2] == (2, 6):
def isclass(object):
""" Return true if the object is a class. Overrides inspect.isclass for
python 2.6 because it will return True for objects which always return
something on __getattr__ calls (see #1035).
Backport of https://hg.python.org/cpython/rev/35bf8f7a8edc
"""
return isinstance(object, (type, types.ClassType))

def _has_positional_arg(func):
return func.__code__.co_argcount

def filter_traceback(entry):
return entry.path != cutdir1 and not entry.path.relto(cutdir2)
# entry.path might sometimes return a str object when the entry
# points to dynamically generated code
# see https://bitbucket.org/pytest-dev/py/issues/71
raw_filename = entry.frame.code.raw.co_filename
is_generated = '<' in raw_filename and '>' in raw_filename
if is_generated:
return False
# entry.path might point to an inexisting file, in which case it will
# alsso return a str object. see #1133
p = py.path.local(entry.path)
return p != cutdir1 and not p.relto(cutdir2)

def get_real_func(obj):

@@ -296,11 +315,14 @@ def pytest_pycollect_makeitem(collector, name, obj):

elif collector.istestfunction(obj, name):
# mock seems to store unbound methods (issue473), normalize it
obj = getattr(obj, "__func__", obj)
if not isfunction(obj):
# We need to try and unwrap the function if it's a functools.partial
# or a funtools.wrapped.
# We musn't if it's been wrapped with mock.patch (python 2 only)
if not (isfunction(obj) or isfunction(get_real_func(obj))):
collector.warn(code="C2", message=
"cannot collect %r because it is not a function."
% name, )
if getattr(obj, "__test__", True):
elif getattr(obj, "__test__", True):
if is_generator(obj):
res = Generator(name, parent=collector)
else:

@@ -362,12 +384,13 @@ class PyobjMixin(PyobjContext):

def reportinfo(self):
# XXX caching?
obj = self.obj
if hasattr(obj, 'compat_co_firstlineno'):
compat_co_firstlineno = getattr(obj, 'compat_co_firstlineno', None)
if isinstance(compat_co_firstlineno, int):
# nose compatibility
fspath = sys.modules[obj.__module__].__file__
if fspath.endswith(".pyc"):
fspath = fspath[:-1]
lineno = obj.compat_co_firstlineno
lineno = compat_co_firstlineno
else:
fspath, lineno = getfslineno(obj)
modpath = self.getmodpath()

@@ -383,7 +406,10 @@ class PyCollector(PyobjMixin, pytest.Collector):

""" Look for the __test__ attribute, which is applied by the
@nose.tools.istest decorator
"""
return safe_getattr(obj, '__test__', False)
# We explicitly check for "is True" here to not mistakenly treat
# classes with a custom __getattr__ returning something truthy (like a
# function) as test classes.
return safe_getattr(obj, '__test__', False) is True

def classnamefilter(self, name):
return self._matches_prefix_or_glob_option('python_classes', name)

@@ -1041,6 +1067,8 @@ class Metafunc(FuncargnamesCompatAttr):

if _PY3:
import codecs

def _escape_bytes(val):
"""
If val is pure ascii, returns it as a str(), otherwise escapes

@@ -1053,18 +1081,21 @@ if _PY3:

want to return escaped bytes for any byte, even if they match
a utf-8 string.
"""
# source: http://goo.gl/bGsnwC
import codecs
encoded_bytes, _ = codecs.escape_encode(val)
return encoded_bytes.decode('ascii')
if val:
# source: http://goo.gl/bGsnwC
encoded_bytes, _ = codecs.escape_encode(val)
return encoded_bytes.decode('ascii')
else:
# empty bytes crashes codecs.escape_encode (#1087)
return ''
else:
def _escape_bytes(val):
"""
In py2 bytes and str are the same, so return it unchanged if it
In py2 bytes and str are the same type, so return it unchanged if it
is a full ascii string, otherwise escape it into its binary form.
"""
try:
return val.encode('ascii')
return val.decode('ascii')
except UnicodeDecodeError:
return val.encode('string-escape')

@@ -1093,7 +1124,7 @@ def _idval(val, argname, idx, idfn):

# convertible to ascii, return it as an str() object instead
try:
return str(val)
except UnicodeDecodeError:
except UnicodeError:
# fallthrough
pass
return str(argname)+str(idx)
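
As an illustrative aside (not part of this commit), the id-escaping changes above
are what make byte-string parameters usable as test ids on both Python lines
(#1085, #1087); empty and non-ascii byte values no longer break id generation::

    import pytest

    @pytest.mark.parametrize("data", [b"ascii", b"\xc3\xa9", b""])
    def test_bytes_params(data):
        assert isinstance(data, bytes)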
@@ -1182,6 +1213,28 @@ def raises(expected_exception, *args, **kwargs):

>>> with raises(ZeroDivisionError):
...    1/0

.. note::

When using ``pytest.raises`` as a context manager, it's worthwhile to
note that normal context manager rules apply and that the exception
raised *must* be the final line in the scope of the context manager.
Lines of code after that, within the scope of the context manager will
not be executed. For example::

>>> with raises(OSError) as err:
assert 1 == 1  # this will execute as expected
raise OSError(errno.EEXISTS, 'directory exists')
assert err.errno == errno.EEXISTS  # this will not execute

Instead, the following approach must be taken (note the difference in
scope)::

>>> with raises(OSError) as err:
assert 1 == 1  # this will execute as expected
raise OSError(errno.EEXISTS, 'directory exists')

assert err.errno == errno.EEXISTS  # this will now execute

Or you can specify a callable by passing a to-be-called lambda::

>>> raises(ZeroDivisionError, lambda: 1/0)
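
As an illustrative aside (not part of this commit), the context-manager form also
exposes the captured exception after the block through the returned
``ExceptionInfo`` object, which is the usual way to assert on its details::

    import pytest

    def test_zero_division():
        with pytest.raises(ZeroDivisionError) as excinfo:
            1 / 0
        # runs after the block, unlike code placed inside it
        assert "division" in str(excinfo.value)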

@@ -2103,7 +2156,7 @@ def num_mock_patch_args(function):

def getfuncargnames(function, startindex=None):
# XXX merge with main.py's varnames
#assert not inspect.isclass(function)
#assert not isclass(function)
realfunction = function
while hasattr(realfunction, "__wrapped__"):
realfunction = realfunction.__wrapped__

@@ -29,26 +29,47 @@ def pytest_namespace():

def deprecated_call(func=None, *args, **kwargs):
"""Assert that ``func(*args, **kwargs)`` triggers a DeprecationWarning.
""" assert that calling ``func(*args, **kwargs)`` triggers a
``DeprecationWarning`` or ``PendingDeprecationWarning``.

This function can be used as a context manager::

>>> with deprecated_call():
...    myobject.deprecated_method()

Note: we cannot use WarningsRecorder here because it is still subject
to the mechanism that prevents warnings of the same type from being
triggered twice for the same module. See #1190.
"""
if not func:
return WarningsChecker(expected_warning=DeprecationWarning)

wrec = WarningsRecorder()
with wrec:
warnings.simplefilter('always')  # ensure all warnings are triggered
ret = func(*args, **kwargs)
categories = []

depwarnings = (DeprecationWarning, PendingDeprecationWarning)
if not any(r.category in depwarnings for r in wrec):
def warn_explicit(message, category, *args, **kwargs):
categories.append(category)
old_warn_explicit(message, category, *args, **kwargs)

def warn(message, category=None, *args, **kwargs):
if isinstance(message, Warning):
categories.append(message.__class__)
else:
categories.append(category)
old_warn(message, category, *args, **kwargs)

old_warn = warnings.warn
old_warn_explicit = warnings.warn_explicit
warnings.warn_explicit = warn_explicit
warnings.warn = warn
try:
ret = func(*args, **kwargs)
finally:
warnings.warn_explicit = old_warn_explicit
warnings.warn = old_warn
deprecation_categories = (DeprecationWarning, PendingDeprecationWarning)
if not any(issubclass(c, deprecation_categories) for c in categories):
__tracebackhide__ = True
raise AssertionError("%r did not produce DeprecationWarning" % (func,))

return ret
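
As an illustrative aside (not part of this commit), both call forms handled by
``deprecated_call()`` above look like this in a test (the warned function is made up)::

    import warnings
    import pytest

    def deprecated_square(x):
        warnings.warn("use x * x instead", DeprecationWarning)
        return x * x

    def test_deprecated_square():
        with pytest.deprecated_call():           # context-manager form
            assert deprecated_square(3) == 9
        assert pytest.deprecated_call(deprecated_square, 4) == 16  # call form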

@@ -473,7 +473,7 @@ def skip(msg=""):

skip.Exception = Skipped

def fail(msg="", pytrace=True):
""" explicitely fail an currently-executing test with the given Message.
""" explicitly fail an currently-executing test with the given Message.

:arg pytrace: if false the msg represents the full failure information
and no python traceback will be reported.

@@ -473,7 +473,8 @@ class TerminalReporter:

self.write_line(line)
else:
msg = self._getfailureheadline(rep)
self.write_sep("_", msg)
markup = {'red': True, 'bold': True}
self.write_sep("_", msg, **markup)
self._outrep_summary(rep)

def summary_errors(self):

@@ -558,7 +559,11 @@ def build_summary_stats_line(stats):

if val:
key_name = key_translation.get(key, key)
parts.append("%d %s" % (len(val), key_name))
line = ", ".join(parts)

if parts:
line = ", ".join(parts)
else:
line = "no tests ran"

if 'failed' in stats or 'error' in stats:
color = 'red'

@@ -69,12 +69,26 @@ class TestCaseFunction(pytest.Function):

def setup(self):
self._testcase = self.parent.obj(self.name)
self._fix_unittest_skip_decorator()
self._obj = getattr(self._testcase, self.name)
if hasattr(self._testcase, 'setup_method'):
self._testcase.setup_method(self._obj)
if hasattr(self, "_request"):
self._request._fillfixtures()

def _fix_unittest_skip_decorator(self):
"""
The @unittest.skip decorator calls functools.wraps(self._testcase)
The call to functools.wraps() fails unless self._testcase
has a __name__ attribute. This is usually automatically supplied
if the test is a function or method, but we need to add manually
here.

See issue #1169
"""
if sys.version_info[0] == 2:
setattr(self._testcase, "__name__", self.name)

def teardown(self):
if hasattr(self._testcase, 'teardown_method'):
self._testcase.teardown_method(self._obj)
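
As an illustrative aside (not part of this commit), the ``__name__`` fix above is
aimed at tests along these lines, where ``@unittest.skip`` is used on a
``TestCase`` collected and run by pytest on Python 2::

    import unittest

    class TestExample(unittest.TestCase):

        @unittest.skip("not relevant on this platform")
        def test_skipped(self):
            self.fail("never runs")

        def test_passes(self):
            self.assertEqual(1 + 1, 2)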

@@ -573,7 +573,7 @@ class _MultiCall:

# XXX note that the __multicall__ argument is supported only
# for pytest compatibility reasons. It was never officially
# supported there and is explicitely deprecated since 2.8
# supported there and is explicitly deprecated since 2.8
# so we can remove it soon, allowing to avoid the below recursion
# in execute() and simplify/speed up the execute loop.

@@ -4,7 +4,7 @@

<li><a href="{{ pathto('contributing') }}">Contribution Guide</a></li>
<li><a href="https://pypi.python.org/pypi/pytest">pytest @ PyPI</a></li>
<li><a href="https://github.com/pytest-dev/pytest/">pytest @ GitHub</a></li>
<li><a href="http://pytest.org/latest/plugins_index/index.html">3rd party plugins</a></li>
<li><a href="http://plugincompat.herokuapp.com/">3rd party plugins</a></li>
<li><a href="https://github.com/pytest-dev/pytest/issues">Issue Tracker</a></li>
<li><a href="http://pytest.org/latest/pytest.pdf">PDF Documentation</a>
</ul>

@@ -5,6 +5,10 @@ Release announcements

.. toctree::
:maxdepth: 2

release-2.8.4
release-2.8.3
release-2.8.2
release-2.7.2
release-2.7.1
release-2.7.0

@@ -0,0 +1,44 @@

pytest-2.8.2: bug fixes
=======================

pytest is a mature Python testing tool with more than a 1100 tests
against itself, passing on many different interpreters and platforms.
This release is supposed to be drop-in compatible to 2.8.1.

See below for the changes and see docs at:

http://pytest.org

As usual, you can upgrade from pypi via::

pip install -U pytest

Thanks to all who contributed to this release, among them:

Bruno Oliveira
Demian Brecht
Florian Bruhin
Ionel Cristian Mărieș
Raphael Pierzina
Ronny Pfannschmidt
holger krekel

Happy testing,
The py.test Development Team

2.8.2 (compared to 2.7.2)
-----------------------------

- fix #1085: proper handling of encoding errors when passing encoded byte
strings to pytest.parametrize in Python 2.
Thanks Themanwithoutaplan for the report and Bruno Oliveira for the PR.

- fix #1087: handling SystemError when passing empty byte strings to
pytest.parametrize in Python 3.
Thanks Paul Kehrer for the report and Bruno Oliveira for the PR.

- fix #995: fixed internal error when filtering tracebacks where one entry
was generated by an exec() statement.
Thanks Daniel Hahler, Ashley C Straw, Philippe Gauthier and Pavel Savchenko
for contributing and Bruno Oliveira for the PR.

@@ -0,0 +1,59 @@

pytest-2.8.3: bug fixes
=======================

pytest is a mature Python testing tool with more than a 1100 tests
against itself, passing on many different interpreters and platforms.
This release is supposed to be drop-in compatible to 2.8.2.

See below for the changes and see docs at:

http://pytest.org

As usual, you can upgrade from pypi via::

pip install -U pytest

Thanks to all who contributed to this release, among them:

Bruno Oliveira
Florian Bruhin
Gabe Hollombe
Gabriel Reis
Hartmut Goebel
John Vandenberg
Lee Kamentsky
Michael Birtwell
Raphael Pierzina
Ronny Pfannschmidt
William Martin Stewart

Happy testing,
The py.test Development Team

2.8.3 (compared to 2.8.2)
-----------------------------

- fix #1169: add __name__ attribute to testcases in TestCaseFunction to
support the @unittest.skip decorator on functions and methods.
Thanks Lee Kamentsky for the PR.

- fix #1035: collecting tests if test module level obj has __getattr__().
Thanks Suor for the report and Bruno Oliveira / Tom Viner for the PR.

- fix #331: don't collect tests if their failure cannot be reported correctly
e.g. they are a callable instance of a class.

- fix #1133: fixed internal error when filtering tracebacks where one entry
belongs to a file which is no longer available.
Thanks Bruno Oliveira for the PR.

- enhancement made to highlight in red the name of the failing tests so
they stand out in the output.
Thanks Gabriel Reis for the PR.

- add more talks to the documentation
- extend documentation on the --ignore cli option
- use pytest-runner for setuptools integration
- minor fixes for interaction with OS X El Capitan system integrity protection (thanks Florian)

@@ -0,0 +1,52 @@

pytest-2.8.4
============

pytest is a mature Python testing tool with more than a 1100 tests
against itself, passing on many different interpreters and platforms.
This release is supposed to be drop-in compatible to 2.8.2.

See below for the changes and see docs at:

http://pytest.org

As usual, you can upgrade from pypi via::

pip install -U pytest

Thanks to all who contributed to this release, among them:

Bruno Oliveira
Florian Bruhin
Jeff Widman
Mehdy Khoshnoody
Nicholas Chammas
Ronny Pfannschmidt
Tim Chan

Happy testing,
The py.test Development Team

2.8.4 (compared to 2.8.3)
-----------------------------

- fix #1190: ``deprecated_call()`` now works when the deprecated
function has been already called by another test in the same
module. Thanks Mikhail Chernykh for the report and Bruno Oliveira for the
PR.

- fix #1198: ``--pastebin`` option now works on Python 3. Thanks
Mehdy Khoshnoody for the PR.

- fix #1219: ``--pastebin`` now works correctly when captured output contains
non-ascii characters. Thanks Bruno Oliveira for the PR.

- fix #1204: another error when collecting with a nasty __getattr__().
Thanks Florian Bruhin for the PR.

- fix the summary printed when no tests did run.
Thanks Florian Bruhin for the PR.

- a number of documentation modernizations wrt good practices.
Thanks Bruno Oliveira for the PR.

@@ -26,7 +26,7 @@ you will see the return value of the function call::

$ py.test test_assert1.py
======= test session starts ========
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 1 items

@@ -146,7 +146,7 @@ if you run this module::

$ py.test test_assert2.py
======= test session starts ========
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 1 items

@@ -192,8 +192,8 @@ provides an alternative explanation for ``Foo`` objects::

from test_foocompare import Foo
def pytest_assertrepr_compare(op, left, right):
if isinstance(left, Foo) and isinstance(right, Foo) and op == "==":
return ['Comparing Foo instances:',
'   vals: %s != %s' % (left.val, right.val)]
return ['Comparing Foo instances:',
'   vals: %s != %s' % (left.val, right.val)]

now, given this test module::

@@ -79,8 +79,8 @@ You can ask for available builtin or project-custom

cache.get(key, default)
cache.set(key, value)

Keys must be strings not containing a "/" separator. Add a unique identifier
(such as plugin/app name) to avoid clashes with other cache users.
Keys must be a ``/`` separated value, where the first part is usually the
name of your plugin or application to avoid clashes with other cache users.

Values can be any object handled by the json stdlib module.
capsys

@@ -131,4 +131,4 @@ You can ask for available builtin or project-custom

directory. The returned object is a `py.path.local`_
path object.

in 0.12 seconds
no tests ran in 0.12 seconds

@@ -5,11 +5,12 @@ Cache: working with cross-testrun state

.. warning::

The functionality of this core plugin was previosuly distributed
The functionality of this core plugin was previously distributed
as a third party plugin named ``pytest-cache``. The core plugin
is compatible regarding command line options and API usage except that you
can only store/receive data between test runs that is json-serializable.

Usage
---------

@@ -26,6 +27,12 @@ all cross-session cache contents ahead of a test run.

Other plugins may access the `config.cache`_ object to set/get
**json encodable** values between ``py.test`` invocations.

.. note::

This plugin is enabled by default, but can be disabled if needed: see
:ref:`cmdunregister` (the internal name for this plugin is
``cacheprovider``).

Rerunning only failures or failures first
-----------------------------------------------

@@ -73,7 +80,7 @@ If you then run it with ``--lf``::

$ py.test --lf
======= test session starts ========
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
run-last-failure: rerun last 2 failures
rootdir: $REGENDOC_TMPDIR, inifile:
collected 50 items

@@ -114,7 +121,7 @@ of ``FF`` and dots)::

$ py.test --ff
======= test session starts ========
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
run-last-failure: rerun last 2 failures first
rootdir: $REGENDOC_TMPDIR, inifile:
collected 50 items

@@ -219,7 +226,7 @@ You can always peek at the content of the cache using the

$ py.test --cache-clear
======= test session starts ========
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
rootdir: $REGENDOC_TMPDIR, inifile:
collected 1 items
|
@ -64,7 +64,7 @@ of the failing function and hide the other one::
|
|||
|
||||
$ py.test
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 2 items
|
||||
|
||||
|
|
|
@ -16,16 +16,17 @@ Full pytest documentation
|
|||
plugins
|
||||
cache
|
||||
contributing
|
||||
plugins_index/index
|
||||
talks
|
||||
|
||||
.. only:: html
|
||||
|
||||
.. toctree::
|
||||
|
||||
funcarg_compare
|
||||
announce/index
|
||||
|
||||
.. only:: html
|
||||
|
||||
.. toctree::
|
||||
:hidden:
|
||||
|
||||
|
|
|
@ -162,7 +162,7 @@ Builtin configuration file options

.. versionadded:: 2.8

    Sets list of directories that should be searched for tests when
    no specific directories or files are given in the command line when
    no specific directories, files or test ids are given in the command line when
    executing pytest from the :ref:`rootdir <rootdir>` directory.
    Useful when all project tests are in a known location to speed up
    test collection and to avoid picking up undesired tests by accident.
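For example, a project that keeps its tests in a couple of known directories
could declare them in its ini file like this (a sketch; the directory names are
hypothetical):

.. code-block:: ini

    # content of pytest.ini - directory names are hypothetical
    [pytest]
    testpaths = tests doc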
|
|
|
@ -46,7 +46,7 @@ then you can just invoke ``py.test`` without command line options::
|
|||
|
||||
$ py.test
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile: pytest.ini
|
||||
collected 1 items
|
||||
|
||||
|
|
|
@ -31,7 +31,7 @@ You can then restrict a test run to only run tests marked with ``webtest``::
|
|||
|
||||
$ py.test -v -m webtest
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
cachedir: .cache
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collecting ... collected 4 items
|
||||
|
@ -45,7 +45,7 @@ Or the inverse, running all tests except the webtest ones::
|
|||
|
||||
$ py.test -v -m "not webtest"
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
cachedir: .cache
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collecting ... collected 4 items
|
||||
|
@ -66,7 +66,7 @@ tests based on their module, class, method, or function name::
|
|||
|
||||
$ py.test -v test_server.py::TestClass::test_method
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
cachedir: .cache
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collecting ... collected 5 items
|
||||
|
@ -79,7 +79,7 @@ You can also select on the class::
|
|||
|
||||
$ py.test -v test_server.py::TestClass
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
cachedir: .cache
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collecting ... collected 4 items
|
||||
|
@ -92,7 +92,7 @@ Or select multiple nodes::
|
|||
|
||||
$ py.test -v test_server.py::TestClass test_server.py::test_send_http
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
cachedir: .cache
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collecting ... collected 8 items
|
||||
|
@ -130,7 +130,7 @@ select tests based on their names::
|
|||
|
||||
$ py.test -v -k http # running with the above defined example module
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
cachedir: .cache
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collecting ... collected 4 items
|
||||
|
@ -144,7 +144,7 @@ And you can also run all tests except the ones that match the keyword::
|
|||
|
||||
$ py.test -k "not send_http" -v
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
cachedir: .cache
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collecting ... collected 4 items
|
||||
|
@ -160,7 +160,7 @@ Or to select "http" and "quick" tests::
|
|||
|
||||
$ py.test -k "http or quick" -v
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
cachedir: .cache
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collecting ... collected 4 items
|
||||
|
@ -219,7 +219,7 @@ For an example on how to add and work with markers from a plugin, see

.. note::

    It is recommended to explicitely register markers so that:
    It is recommended to explicitly register markers so that:

    * there is one place in your test suite defining your markers
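One common way to register a marker is through the project's ini file; a
minimal sketch (the ``webtest`` marker name is just an example):

.. code-block:: ini

    [pytest]
    markers =
        webtest: mark a test as a webtest.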
|
||||
|
@ -350,7 +350,7 @@ the test needs::
|
|||
|
||||
$ py.test -E stage2
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 1 items
|
||||
|
||||
|
@ -362,7 +362,7 @@ and here is one that specifies exactly the environment needed::
|
|||
|
||||
$ py.test -E stage1
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 1 items
|
||||
|
||||
|
@ -481,7 +481,7 @@ then you will see two test skipped and two executed tests as expected::
|
|||
|
||||
$ py.test -rs # this option reports skip reasons
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 4 items
|
||||
|
||||
|
@ -495,7 +495,7 @@ Note that if you specify a platform via the marker-command line option like this
|
|||
|
||||
$ py.test -m linux2
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 4 items
|
||||
|
||||
|
@ -547,7 +547,7 @@ We can now use the ``-m option`` to select one set::
|
|||
|
||||
$ py.test -m interface --tb=short
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 4 items
|
||||
|
||||
|
@ -569,7 +569,7 @@ or to select both "event" and "interface" tests::
|
|||
|
||||
$ py.test -m "interface or event" --tb=short
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 4 items
|
||||
|
||||
|
|
|
@ -27,11 +27,11 @@ now execute the test specification::
|
|||
|
||||
nonpython $ py.test test_simple.yml
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR/nonpython, inifile:
|
||||
collected 2 items
|
||||
|
||||
test_simple.yml F.
|
||||
test_simple.yml .F
|
||||
|
||||
======= FAILURES ========
|
||||
_______ usecase: hello ________
|
||||
|
@ -59,13 +59,13 @@ consulted when reporting in ``verbose`` mode::
|
|||
|
||||
nonpython $ py.test -v
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
cachedir: .cache
|
||||
rootdir: $REGENDOC_TMPDIR/nonpython, inifile:
|
||||
collecting ... collected 2 items
|
||||
|
||||
test_simple.yml::hello FAILED
|
||||
test_simple.yml::ok PASSED
|
||||
test_simple.yml::hello FAILED
|
||||
|
||||
======= FAILURES ========
|
||||
_______ usecase: hello ________
|
||||
|
@ -81,11 +81,11 @@ interesting to just look at the collection tree::
|
|||
|
||||
nonpython $ py.test --collect-only
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR/nonpython, inifile:
|
||||
collected 2 items
|
||||
<YamlFile 'test_simple.yml'>
|
||||
<YamlItem 'hello'>
|
||||
<YamlItem 'ok'>
|
||||
<YamlItem 'hello'>
|
||||
|
||||
======= in 0.12 seconds ========
|
||||
======= no tests ran in 0.12 seconds ========
|
||||
|
|
|
@ -130,7 +130,7 @@ objects, they are still using the default pytest representation::
|
|||
|
||||
$ py.test test_time.py --collect-only
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 6 items
|
||||
<Module 'test_time.py'>
|
||||
|
@ -141,7 +141,7 @@ objects, they are still using the default pytest representation::
|
|||
<Function 'test_timedistance_v2[20011212-20011211-expected0]'>
|
||||
<Function 'test_timedistance_v2[20011211-20011212-expected1]'>
|
||||
|
||||
======= in 0.12 seconds ========
|
||||
======= no tests ran in 0.12 seconds ========
|
||||
|
||||
A quick port of "testscenarios"
|
||||
------------------------------------
|
||||
|
@ -181,7 +181,7 @@ this is a fully self-contained example which you can run with::
|
|||
|
||||
$ py.test test_scenarios.py
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 4 items
|
||||
|
||||
|
@ -194,7 +194,7 @@ If you just collect tests you'll also nicely see 'advanced' and 'basic' as varia
|
|||
|
||||
$ py.test --collect-only test_scenarios.py
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 4 items
|
||||
<Module 'test_scenarios.py'>
|
||||
|
@ -205,7 +205,7 @@ If you just collect tests you'll also nicely see 'advanced' and 'basic' as varia
|
|||
<Function 'test_demo1[advanced]'>
|
||||
<Function 'test_demo2[advanced]'>
|
||||
|
||||
======= in 0.12 seconds ========
|
||||
======= no tests ran in 0.12 seconds ========
|
||||
|
||||
Note that we told ``metafunc.parametrize()`` that your scenario values
|
||||
should be considered class-scoped. With pytest-2.3 this leads to a
|
||||
|
@ -259,14 +259,14 @@ Let's first see how it looks like at collection time::
|
|||
|
||||
$ py.test test_backends.py --collect-only
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 2 items
|
||||
<Module 'test_backends.py'>
|
||||
<Function 'test_db_initialized[d1]'>
|
||||
<Function 'test_db_initialized[d2]'>
|
||||
|
||||
======= in 0.12 seconds ========
|
||||
======= no tests ran in 0.12 seconds ========
|
||||
|
||||
And then when we run the test::
|
||||
|
||||
|
@ -320,25 +320,25 @@ The result of this test will be successful::
|
|||
|
||||
$ py.test test_indirect_list.py --collect-only
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 1 items
|
||||
<Module 'test_indirect_list.py'>
|
||||
<Function 'test_indirect[a-b]'>
|
||||
|
||||
======= in 0.12 seconds ========
|
||||
======= no tests ran in 0.12 seconds ========
|
||||
|
||||
.. regendoc:wipe
|
||||
|
||||
Parametrizing test methods through per-class configuration
|
||||
--------------------------------------------------------------
|
||||
|
||||
.. _`unittest parameterizer`: http://code.google.com/p/unittest-ext/source/browse/trunk/params.py
|
||||
.. _`unittest parametrizer`: http://code.google.com/p/unittest-ext/source/browse/trunk/params.py
|
||||
|
||||
|
||||
Here is an example ``pytest_generate_function`` function implementing a
|
||||
parametrization scheme similar to Michael Foord's `unittest
|
||||
parameterizer`_ but in a lot less code::
|
||||
parametrizer`_ but in a lot less code::
|
||||
|
||||
# content of ./test_parametrize.py
|
||||
import pytest
|
||||
|
@ -397,8 +397,11 @@ is to be run with different sets of arguments for its three arguments:
|
|||
Running it results in some skips if we don't have all the python interpreters installed and otherwise runs all combinations (5 interpreters times 5 interpreters times 3 objects to serialize/deserialize)::
|
||||
|
||||
. $ py.test -rs -q multipython.py
|
||||
...........................
|
||||
27 passed in 0.12 seconds
|
||||
ssssssssssss...ssssssssssss
|
||||
======= short test summary info ========
|
||||
SKIP [12] $REGENDOC_TMPDIR/CWD/multipython.py:22: 'python2.6' not found
|
||||
SKIP [12] $REGENDOC_TMPDIR/CWD/multipython.py:22: 'python3.3' not found
|
||||
3 passed, 24 skipped in 0.12 seconds
|
||||
|
||||
Indirect parametrization of optional implementations/imports
|
||||
--------------------------------------------------------------------
|
||||
|
@ -445,7 +448,7 @@ If you run this with reporting for skips enabled::
|
|||
|
||||
$ py.test -rs test_module.py
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 2 items
|
||||
|
||||
|
|
|
@ -1,6 +1,45 @@
Changing standard (Python) test discovery
===============================================

Ignore paths during test collection
-----------------------------------

You can easily ignore certain test directories and modules during collection
by passing the ``--ignore=path`` option on the cli. ``pytest`` allows multiple
``--ignore`` options. Example::

    tests/
    ├── example
    │   ├── test_example_01.py
    │   ├── test_example_02.py
    │   └── test_example_03.py
    ├── foobar
    │   ├── test_foobar_01.py
    │   ├── test_foobar_02.py
    │   └── test_foobar_03.py
    └── hello
        └── world
            ├── test_world_01.py
            ├── test_world_02.py
            └── test_world_03.py

Now if you invoke ``pytest`` with ``--ignore=tests/foobar/test_foobar_03.py --ignore=tests/hello/``,
you will see that ``pytest`` only collects test-modules, which do not match the patterns specified::

    ========= test session starts ==========
    platform darwin -- Python 2.7.10, pytest-2.8.2, py-1.4.30, pluggy-0.3.1
    rootdir: $REGENDOC_TMPDIR, inifile:
    collected 5 items

    tests/example/test_example_01.py .
    tests/example/test_example_02.py .
    tests/example/test_example_03.py .
    tests/foobar/test_foobar_01.py .
    tests/foobar/test_foobar_02.py .

    ======= 5 passed in 0.02 seconds =======
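If those paths should be excluded on every run, the same options can also be
placed in the project's ini file via ``addopts`` (a sketch, assuming a
``pytest.ini`` at the rootdir):

.. code-block:: ini

    # content of pytest.ini - a sketch; adjust the paths to your project
    [pytest]
    addopts = --ignore=tests/foobar/test_foobar_03.py --ignore=tests/hello/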
|
||||
|
||||
Changing directory recursion
-----------------------------------------------------
|
||||
|
@ -43,7 +82,7 @@ then the test collection looks like this::
|
|||
|
||||
$ py.test --collect-only
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile: setup.cfg
|
||||
collected 2 items
|
||||
<Module 'check_myapp.py'>
|
||||
|
@ -52,7 +91,7 @@ then the test collection looks like this::
|
|||
<Function 'simple_check'>
|
||||
<Function 'complex_check'>
|
||||
|
||||
======= in 0.12 seconds ========
|
||||
======= no tests ran in 0.12 seconds ========
|
||||
|
||||
.. note::
|
||||
|
||||
|
@ -89,7 +128,7 @@ You can always peek at the collection tree without running tests like this::
|
|||
|
||||
. $ py.test --collect-only pythoncollection.py
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile: pytest.ini
|
||||
collected 3 items
|
||||
<Module 'CWD/pythoncollection.py'>
|
||||
|
@ -99,7 +138,7 @@ You can always peek at the collection tree without running tests like this::
|
|||
<Function 'test_method'>
|
||||
<Function 'test_anothermethod'>
|
||||
|
||||
======= in 0.12 seconds ========
|
||||
======= no tests ran in 0.12 seconds ========
|
||||
|
||||
customizing test collection to find all .py files
|
||||
---------------------------------------------------------
|
||||
|
@ -136,18 +175,18 @@ And then if you have a module file like this::

and a setup.py dummy file like this::

    # content of setup.py
    0/0  # will raise exeption if imported
    0/0  # will raise exception if imported

then a pytest run on python2 will find the one test when run with a python2
interpreters and will leave out the setup.py file::

    $ py.test --collect-only
    ======= test session starts ========
    platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
    platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
    rootdir: $REGENDOC_TMPDIR, inifile: pytest.ini
    collected 0 items

    ======= in 0.12 seconds ========
    ======= no tests ran in 0.12 seconds ========

If you run with a Python3 interpreter the module added through the conftest.py file will not be considered for test collection.
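This behaviour is driven by pytest's ``collect_ignore`` conftest variable; a
minimal sketch of such a ``conftest.py`` (the ignored module path is a made-up
example)::

    # content of conftest.py - illustrative sketch only
    import sys

    collect_ignore = ["setup.py"]  # never collect setup.py
    if sys.version_info[0] > 2:
        # a hypothetical module that only imports cleanly on python2
        collect_ignore.append("pkg/module_py2.py")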
|
||||
|
|
|
@ -13,7 +13,7 @@ get on the terminal - we are working on that):
|
|||
|
||||
assertion $ py.test failure_demo.py
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR/assertion, inifile:
|
||||
collected 42 items
|
||||
|
||||
|
@ -361,7 +361,7 @@ get on the terminal - we are working on that):
|
|||
> int(s)
|
||||
E ValueError: invalid literal for int() with base 10: 'qwe'
|
||||
|
||||
<0-codegen $PYTHON_PREFIX/lib/python3.4/site-packages/_pytest/python.py:1247>:1: ValueError
|
||||
<0-codegen $PYTHON_PREFIX/lib/python3.4/site-packages/_pytest/python.py:1300>:1: ValueError
|
||||
_______ TestRaises.test_raises_doesnt ________
|
||||
|
||||
self = <failure_demo.TestRaises object at 0xdeadbeef>
|
||||
|
|
|
@ -108,11 +108,11 @@ directory with the above conftest.py::

    $ py.test
    ======= test session starts ========
    platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
    platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
    rootdir: $REGENDOC_TMPDIR, inifile:
    collected 0 items

    ======= in 0.12 seconds ========
    ======= no tests ran in 0.12 seconds ========

.. _`excontrolskip`:

@ -131,20 +131,23 @@ line option to control skipping of ``slow`` marked tests::

    parser.addoption("--runslow", action="store_true",
        help="run slow tests")

def pytest_runtest_setup(item):
    if 'slow' in item.keywords and not item.config.getoption("--runslow"):
        pytest.skip("need --runslow option to run")

We can now write a test module like this::

    # content of test_module.py

    import pytest
    slow = pytest.mark.slow


    slow = pytest.mark.skipif(
        not pytest.config.getoption("--runslow"),
        reason="need --runslow option to run"
    )


    def test_func_fast():
        pass


    @slow
    def test_func_slow():
        pass
||||
|
@ -153,13 +156,13 @@ and when running it will see a skipped "slow" test::
|
|||
|
||||
$ py.test -rs # "-rs" means report details on the little 's'
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 2 items
|
||||
|
||||
test_module.py .s
|
||||
======= short test summary info ========
|
||||
SKIP [1] $REGENDOC_TMPDIR/conftest.py:9: need --runslow option to run
|
||||
SKIP [1] test_module.py:14: need --runslow option to run
|
||||
|
||||
======= 1 passed, 1 skipped in 0.12 seconds ========
|
||||
|
||||
|
@ -167,7 +170,7 @@ Or run it including the ``slow`` marked test::
|
|||
|
||||
$ py.test --runslow
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 2 items
|
||||
|
||||
|
@ -259,12 +262,12 @@ which will add the string to the test header accordingly::
|
|||
|
||||
$ py.test
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
project deps: mylib-1.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 0 items
|
||||
|
||||
======= in 0.12 seconds ========
|
||||
======= no tests ran in 0.12 seconds ========
|
||||
|
||||
.. regendoc:wipe
|
||||
|
||||
|
@ -283,24 +286,24 @@ which will add info only when run with "--v"::
|
|||
|
||||
$ py.test -v
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
cachedir: .cache
|
||||
info1: did you know that ...
|
||||
did you?
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collecting ... collected 0 items
|
||||
|
||||
======= in 0.12 seconds ========
|
||||
======= no tests ran in 0.12 seconds ========
|
||||
|
||||
and nothing when run plainly::
|
||||
|
||||
$ py.test
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 0 items
|
||||
|
||||
======= in 0.12 seconds ========
|
||||
======= no tests ran in 0.12 seconds ========
|
||||
|
||||
profiling test duration
|
||||
--------------------------
|
||||
|
@ -310,7 +313,7 @@ profiling test duration
|
|||
.. versionadded: 2.2
|
||||
|
||||
If you have a slow running large test suite you might want to find
|
||||
out which tests are the slowest. Let's make an artifical test suite::
|
||||
out which tests are the slowest. Let's make an artificial test suite::
|
||||
|
||||
# content of test_some_are_slow.py
|
||||
|
||||
|
@ -329,7 +332,7 @@ Now we can profile which test functions execute the slowest::
|
|||
|
||||
$ py.test --durations=3
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 3 items
|
||||
|
||||
|
@ -338,7 +341,7 @@ Now we can profile which test functions execute the slowest::
|
|||
======= slowest 3 test durations ========
|
||||
0.20s call test_some_are_slow.py::test_funcslow2
|
||||
0.10s call test_some_are_slow.py::test_funcslow1
|
||||
0.00s setup test_some_are_slow.py::test_funcslow2
|
||||
0.00s teardown test_some_are_slow.py::test_funcslow2
|
||||
======= 3 passed in 0.12 seconds ========
|
||||
|
||||
incremental testing - test steps
|
||||
|
@ -391,7 +394,7 @@ If we run this::
|
|||
|
||||
$ py.test -rx
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 4 items
|
||||
|
||||
|
@ -424,7 +427,7 @@ by placing fixture functions in a ``conftest.py`` file in that directory

You can use all types of fixtures including :ref:`autouse fixtures
<autouse fixtures>` which are the equivalent of xUnit's setup/teardown
concept. It's however recommended to have explicit fixture references in your
tests or test classes rather than relying on implicitely executing
tests or test classes rather than relying on implicitly executing
setup/teardown functions, especially if they are far away from the actual tests.

Here is an example for making a ``db`` fixture available in a directory::
||||
|
@ -462,7 +465,7 @@ We can run this::
|
|||
|
||||
$ py.test
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 7 items
|
||||
|
||||
|
@ -476,7 +479,7 @@ We can run this::
|
|||
file $REGENDOC_TMPDIR/b/test_error.py, line 1
|
||||
def test_root(db): # no db here, will error out
|
||||
fixture 'db' not found
|
||||
available fixtures: tmpdir, pytestconfig, record_xml_property, monkeypatch, recwarn, tmpdir_factory, capsys, capfd, cache
|
||||
available fixtures: tmpdir, record_xml_property, cache, capsys, monkeypatch, recwarn, pytestconfig, tmpdir_factory, capfd
|
||||
use 'py.test --fixtures [testpath]' for help on them.
|
||||
|
||||
$REGENDOC_TMPDIR/b/test_error.py:1
|
||||
|
@ -566,7 +569,7 @@ and run them::
|
|||
|
||||
$ py.test test_module.py
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 2 items
|
||||
|
||||
|
@ -657,7 +660,7 @@ and run it::
|
|||
|
||||
$ py.test -s test_module.py
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 3 items
|
||||
|
||||
|
|
|
@ -60,7 +60,7 @@ and customizable testing framework for Python. Note, however, that
thus likely not something for Python beginners.

A second "magic" issue was the assert statement debugging feature.
Nowadays, ``pytest`` explicitely rewrites assert statements in test modules
Nowadays, ``pytest`` explicitly rewrites assert statements in test modules
in order to provide more useful :ref:`assert feedback <assertfeedback>`.
This completely avoids previous issues of confusing assertion-reporting.
It also means, that you can use Python's ``-O`` optimization without losing

@ -76,7 +76,7 @@ be the same, confusing the reinterpreter and obfuscating the initial
error (this is also explained at the command line if it happens).

You can also turn off all assertion interaction using the
``--assertmode=off`` option.
``--assert=plain`` option.

.. _`py namespaces`: index.html
.. _`py/__init__.py`: http://bitbucket.org/hpk42/py-trunk/src/trunk/py/__init__.py

@ -141,10 +141,10 @@ However, with pytest-2.3 you can use the :ref:`@pytest.fixture` decorator
and specify ``params`` so that all tests depending on the factory-created
resource will run multiple times with different parameters.

You can also use the `pytest_generate_tests`_ hook to
implement the `parametrization scheme of your choice`_.
You can also use the ``pytest_generate_tests`` hook to
implement the `parametrization scheme of your choice`_. See also
:ref:`paramexamples` for more examples.

.. _`pytest_generate_tests`: test/funcargs.html#parametrizing-tests
.. _`parametrization scheme of your choice`: http://tetamap.wordpress.com/2009/05/13/parametrizing-python-tests-generalized/
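For reference, a minimal sketch of such a hook (the fixture name and the
parameter values are made up for illustration)::

    # content of conftest.py - illustrative sketch only
    def pytest_generate_tests(metafunc):
        if "db_backend" in metafunc.fixturenames:
            # parametrize any test that requests the made-up "db_backend" argument
            metafunc.parametrize("db_backend", ["sqlite", "postgres"])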
|
||||
pytest interaction with other packages
|
|
|
@ -75,7 +75,7 @@ marked ``smtp`` fixture function. Running the test looks like this::
|
|||
|
||||
$ py.test test_smtpsimple.py
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 1 items
|
||||
|
||||
|
@ -193,7 +193,7 @@ inspect what is going on and can now run the tests::
|
|||
|
||||
$ py.test test_module.py
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 2 items
|
||||
|
||||
|
@ -480,7 +480,7 @@ Running the above tests results in the following test IDs being used::
|
|||
|
||||
$ py.test --collect-only
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 10 items
|
||||
<Module 'test_anothersmtp.py'>
|
||||
|
@ -497,7 +497,7 @@ Running the above tests results in the following test IDs being used::
|
|||
<Function 'test_ehlo[mail.python.org]'>
|
||||
<Function 'test_noop[mail.python.org]'>
|
||||
|
||||
======= in 0.12 seconds ========
|
||||
======= no tests ran in 0.12 seconds ========
|
||||
|
||||
.. _`interdependent fixtures`:
|
||||
|
||||
|
@ -531,7 +531,7 @@ Here we declare an ``app`` fixture which receives the previously defined
|
|||
|
||||
$ py.test -v test_appsetup.py
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
cachedir: .cache
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collecting ... collected 2 items
|
||||
|
@ -597,7 +597,7 @@ Let's run the tests in verbose mode and with looking at the print-output::
|
|||
|
||||
$ py.test -v -s test_module.py
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1 -- $PYTHON_PREFIX/bin/python3.4
|
||||
cachedir: .cache
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collecting ... collected 8 items
|
||||
|
|
|
@ -209,7 +209,7 @@ fixtures:
|
|||
and let pytest figure things out for you.
|
||||
|
||||
* if you used parametrization and funcarg factories which made use of
|
||||
``request.cached_setup()`` it is recommeneded to invest a few minutes
|
||||
``request.cached_setup()`` it is recommended to invest a few minutes
|
||||
and simplify your fixture function code to use the :ref:`@pytest.fixture`
|
||||
decorator instead. This will also allow to take advantage of
|
||||
the automatic per-resource grouping of tests.
|
||||
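For instance, a parameter-taking ``cached_setup()`` factory can usually be
rewritten as a parametrized fixture along these lines (a sketch; the fixture
name and the parameter values are made up)::

    import pytest

    @pytest.fixture(scope="session", params=["mysql", "pg"])
    def db(request):
        # stand-in for the real connection setup; request.param selects the backend
        return {"backend": request.param}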
|
|
|
@ -27,7 +27,7 @@ Installation options::
|
|||
To check your installation has installed the correct version::
|
||||
|
||||
$ py.test --version
|
||||
This is pytest version 2.8.1, imported from $PYTHON_PREFIX/lib/python3.4/site-packages/pytest.py
|
||||
This is pytest version 2.8.4, imported from $PYTHON_PREFIX/lib/python3.4/site-packages/pytest.py
|
||||
|
||||
If you get an error checkout :ref:`installation issues`.
|
||||
|
||||
|
@ -49,7 +49,7 @@ That's it. You can execute the test function now::
|
|||
|
||||
$ py.test
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 1 items
|
||||
|
||||
|
@ -106,8 +106,6 @@ Running it with, this time in "quiet" reporting mode::
|
|||
.
|
||||
1 passed in 0.12 seconds
|
||||
|
||||
.. todo:: For further ways to assert exceptions see the `raises`
|
||||
|
||||
Grouping multiple tests in a class
|
||||
--------------------------------------------------------------
|
||||
|
||||
|
|
|
@ -4,29 +4,28 @@
Good Integration Practices
=================================================

Work with virtual environments
-----------------------------------------------------------

We recommend to use virtualenv_ environments and use pip_
(or easy_install_) for installing your application and any dependencies
as well as the ``pytest`` package itself. This way you will get an isolated
and reproducible environment. Given you have installed virtualenv_
and execute it from the command line, here is an example session for unix
or windows::
.. _`test discovery`:
.. _`Python test discovery`:

virtualenv . # create a virtualenv directory in the current directory
Conventions for Python test discovery
-------------------------------------------------

source bin/activate # on unix
``pytest`` implements the following standard test discovery:

scripts/activate # on Windows
* If no arguments are specified then collection starts from :confval:`testpaths`
  (if configured) or the current directory. Alternatively, command line arguments
  can be used in any combination of directories, file names or node ids.
* recurse into directories, unless they match :confval:`norecursedirs`
* ``test_*.py`` or ``*_test.py`` files, imported by their `test package name`_.
* ``Test`` prefixed test classes (without an ``__init__`` method)
* ``test_`` prefixed test functions or methods are test items

We can now install pytest::
For examples of how to customize your test discovery :doc:`example/pythoncollection`.

pip install pytest
Within Python modules, ``pytest`` also discovers tests using the standard
:ref:`unittest.TestCase <unittest.TestCase>` subclassing technique.

Due to the ``activate`` step above the ``pip`` will come from
the virtualenv directory and install any package into the isolated
virtual environment.

Choosing a test layout / import rules
------------------------------------------
||||
|
@ -135,8 +134,13 @@ required configurations.

.. _`use tox`:

Use tox and Continuous Integration servers
-------------------------------------------------
Tox
------

For development, we recommend to use virtualenv_ environments and pip_
for installing your application and any dependencies
as well as the ``pytest`` package itself. This ensures your code and
dependencies are isolated from the system Python installation.

If you frequently release code and want to make sure that your actual
package passes all tests you may want to look into `tox`_, the
||||
|
@ -148,89 +152,56 @@ options. It will run tests against the installed package and not
|
|||
against your source code checkout, helping to detect packaging
|
||||
glitches.
|
||||
|
||||
If you want to use Jenkins_ you can use the ``--junitxml=PATH`` option
|
||||
to create a JUnitXML file that Jenkins_ can pick up and generate reports.
|
||||
|
||||
.. _standalone:
|
||||
.. _`genscript method`:
|
||||
|
||||
(deprecated) Create a pytest standalone script
|
||||
-----------------------------------------------
|
||||
|
||||
If you are a maintainer or application developer and want people
|
||||
who don't deal with python much to easily run tests you may generate
|
||||
a standalone ``pytest`` script::
|
||||
|
||||
py.test --genscript=runtests.py
|
||||
|
||||
This generates a ``runtests.py`` script which is a fully functional basic
|
||||
``pytest`` script, running unchanged under Python2 and Python3.
|
||||
You can tell people to download the script and then e.g. run it like this::
|
||||
|
||||
python runtests.py
|
||||
|
||||
.. note::
|
||||
|
||||
You must have pytest and its dependencies installed as an sdist, not
|
||||
as wheels because genscript need the source code for generating a
|
||||
standalone script.
|
||||
Continuous integration services such as Jenkins_ can make use of the
|
||||
``--junitxml=PATH`` option to create a JUnitXML file and generate reports.
|
||||
|
||||
|
||||
Integrating with setuptools / ``python setup.py test`` / ``pytest-runner``
|
||||
--------------------------------------------------------------------------
|
||||
|
||||
You can integrate test runs into your setuptools based project
|
||||
with the `pytest-runner <https://pypi.python.org/pypi/pytest-runner>`_ plugin.
|
||||
|
||||
Integrating with setuptools / ``python setup.py test``
|
||||
------------------------------------------------------
|
||||
Add this to ``setup.py`` file:
|
||||
|
||||
You can integrate test runs into your
|
||||
setuptools based project. Use the `genscript method`_
|
||||
to generate a standalone ``pytest`` script::
|
||||
|
||||
py.test --genscript=runtests.py
|
||||
|
||||
and make this script part of your distribution and then add
|
||||
this to your ``setup.py`` file::
|
||||
|
||||
from distutils.core import setup, Command
|
||||
# you can also import from setuptools
|
||||
|
||||
class PyTest(Command):
|
||||
user_options = []
|
||||
def initialize_options(self):
|
||||
pass
|
||||
|
||||
def finalize_options(self):
|
||||
pass
|
||||
|
||||
def run(self):
|
||||
import subprocess
|
||||
import sys
|
||||
errno = subprocess.call([sys.executable, 'runtests.py'])
|
||||
raise SystemExit(errno)
|
||||
.. code-block:: python
|
||||
|
||||
from setuptools import setup
|
||||
|
||||
setup(
|
||||
#...,
|
||||
cmdclass = {'test': PyTest},
|
||||
setup_requires=['pytest-runner', ...],
|
||||
tests_require=['pytest', ...],
|
||||
#...,
|
||||
)
|
||||
|
||||
|
||||
And create an alias into ``setup.cfg`` file:
|
||||
|
||||
|
||||
.. code-block:: ini
|
||||
|
||||
[aliases]
|
||||
test=pytest
|
||||
|
||||
If you now type::
|
||||
|
||||
python setup.py test
|
||||
|
||||
this will execute your tests using ``runtests.py``. As this is a
|
||||
this will execute your tests using ``pytest-runner``. As this is a
|
||||
standalone version of ``pytest`` no prior installation whatsoever is
|
||||
required for calling the test command. You can also pass additional
|
||||
arguments to the subprocess-calls such as your test directory or other
|
||||
options.
|
||||
arguments to py.test such as your test directory or other
|
||||
options using ``--addopts``.
|
||||
|
||||
|
||||
Integration with setuptools test commands
|
||||
----------------------------------------------------
|
||||
Manual Integration
|
||||
^^^^^^^^^^^^^^^^^^
|
||||
|
||||
Setuptools supports writing our own Test command for invoking pytest.
|
||||
Most often it is better to use tox_ instead, but here is how you can
|
||||
get started with setuptools integration::
|
||||
If for some reason you don't want/can't use ``pytest-runner``, you can write
|
||||
your own setuptools Test command for invoking pytest.
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
import sys
|
||||
|
||||
|
@ -244,11 +215,6 @@ get started with setuptools integration::
|
|||
TestCommand.initialize_options(self)
|
||||
self.pytest_args = []
|
||||
|
||||
def finalize_options(self):
|
||||
TestCommand.finalize_options(self)
|
||||
self.test_args = []
|
||||
self.test_suite = True
|
||||
|
||||
def run_tests(self):
|
||||
#import here, cause outside the eggs aren't loaded
|
||||
import pytest
|
||||
|
@ -274,32 +240,39 @@ using the ``--pytest-args`` or ``-a`` command-line option. For example::
|
|||
|
||||
is equivalent to running ``py.test --durations=5``.
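Pieced together, the manual integration sketched in the fragments above
amounts to a ``setup.py`` roughly like the following (a sketch; adapt the
option handling and metadata to your project)::

    # content of setup.py - a condensed sketch of the manual integration
    import sys

    from setuptools import setup
    from setuptools.command.test import test as TestCommand


    class PyTest(TestCommand):
        user_options = [('pytest-args=', 'a', "Arguments to pass to py.test")]

        def initialize_options(self):
            TestCommand.initialize_options(self)
            self.pytest_args = []

        def run_tests(self):
            # import here, because outside the eggs aren't loaded yet
            import pytest
            errno = pytest.main(self.pytest_args)
            sys.exit(errno)


    setup(
        # ...,
        tests_require=['pytest'],
        cmdclass={'test': PyTest},
    )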
|
||||
|
||||
.. seealso::
|
||||
|
||||
For a more powerful solution, take a look at the
|
||||
`pytest-runner <https://pypi.python.org/pypi/pytest-runner>`_ plugin.
|
||||
.. _standalone:
|
||||
.. _`genscript method`:
|
||||
|
||||
.. _`test discovery`:
|
||||
.. _`Python test discovery`:
|
||||
(deprecated) Create a pytest standalone script
|
||||
-----------------------------------------------
|
||||
|
||||
Conventions for Python test discovery
|
||||
-------------------------------------------------
|
||||
.. deprecated:: 2.8
|
||||
|
||||
``pytest`` implements the following standard test discovery:
|
||||
.. note::
|
||||
|
||||
* collection starts from paths specified in :confval:`testpaths` if configured,
|
||||
otherwise from initial command line arguments which may be directories,
|
||||
filenames or test ids. If :confval:`testpaths` is not configured and no
|
||||
directories or files were given in the command line, start collection from
|
||||
the current directory.
|
||||
* recurse into directories, unless they match :confval:`norecursedirs`
|
||||
* ``test_*.py`` or ``*_test.py`` files, imported by their `test package name`_.
|
||||
* ``Test`` prefixed test classes (without an ``__init__`` method)
|
||||
* ``test_`` prefixed test functions or methods are test items
|
||||
``genscript`` has been deprecated because:
|
||||
|
||||
For examples of how to customize your test discovery :doc:`example/pythoncollection`.
|
||||
* It cannot support plugins, rendering its usefulness extremely limited;
|
||||
* Tooling has become much better since ``genscript`` was introduced;
|
||||
* It is possible to build a zipped ``pytest`` application without the
|
||||
shortcomings above.
|
||||
|
||||
There's no planned version in which this command will be removed
|
||||
at the moment of this writing, but its use is discouraged for new
|
||||
applications.
|
||||
|
||||
If you are a maintainer or application developer and want people
|
||||
who don't deal with python much to easily run tests you may generate
|
||||
a standalone ``pytest`` script::
|
||||
|
||||
py.test --genscript=runtests.py
|
||||
|
||||
This generates a ``runtests.py`` script which is a fully functional basic
|
||||
``pytest`` script, running unchanged under Python2 and Python3.
|
||||
You can tell people to download the script and then e.g. run it like this::
|
||||
|
||||
python runtests.py
|
||||
|
||||
Within Python modules, ``pytest`` also discovers tests using the standard
|
||||
:ref:`unittest.TestCase <unittest.TestCase>` subclassing technique.
|
||||
|
||||
.. include:: links.inc
|
||||
|
|
|
@ -55,7 +55,7 @@ them in turn::
|
|||
|
||||
$ py.test
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 3 items
|
||||
|
||||
|
@ -103,7 +103,7 @@ Let's run this::
|
|||
|
||||
$ py.test
|
||||
======= test session starts ========
|
||||
platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
|
||||
platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
|
||||
rootdir: $REGENDOC_TMPDIR, inifile:
|
||||
collected 3 items
|
||||
|
||||
|
@ -196,12 +196,12 @@ As expected our test function fails.
|
|||
|
||||
If you don't specify a stringinput it will be skipped because
|
||||
``metafunc.parametrize()`` will be called with an empty parameter
|
||||
listlist::
|
||||
list::
|
||||
|
||||
$ py.test -q -rs test_strings.py
|
||||
s
|
||||
======= short test summary info ========
|
||||
SKIP [1] $PYTHON_PREFIX/lib/python3.4/site-packages/_pytest/python.py:1364: got empty parameter set, function test_valid_string at $REGENDOC_TMPDIR/test_strings.py:1
|
||||
SKIP [1] $PYTHON_PREFIX/lib/python3.4/site-packages/_pytest/python.py:1417: got empty parameter set, function test_valid_string at $REGENDOC_TMPDIR/test_strings.py:1
|
||||
1 skipped in 0.12 seconds
|
||||
|
||||
For further examples, you might want to look at :ref:`more
|
||||
|
|
|
@ -14,10 +14,9 @@ Installing a third party plugin can be easily done with ``pip``::
|
|||
pip uninstall pytest-NAME
|
||||
|
||||
If a plugin is installed, ``pytest`` automatically finds and integrates it,
|
||||
there is no need to activate it. We have a :doc:`page listing
|
||||
all 3rd party plugins and their status against the latest py.test version
|
||||
<plugins_index/index>` and here is a little annotated list
|
||||
for some popular plugins:
|
||||
there is no need to activate it.
|
||||
|
||||
Here is a little annotated list for some popular plugins:
|
||||
|
||||
.. _`django`: https://www.djangoproject.com/
|
||||
|
||||
|
@ -28,7 +27,7 @@ for some popular plugins:
|
|||
for `twisted <http://twistedmatrix.com>`_ apps, starting a reactor and
|
||||
processing deferreds from test functions.
|
||||
|
||||
* `pytest-capturelog <http://pypi.python.org/pypi/pytest-capturelog>`_:
|
||||
* `pytest-catchlog <http://pypi.python.org/pypi/pytest-catchlog>`_:
|
||||
to capture and assert about messages from the logging module
|
||||
|
||||
* `pytest-cov <http://pypi.python.org/pypi/pytest-cov>`_:
|
||||
|
@ -50,15 +49,14 @@ for some popular plugins:
|
|||
* `pytest-timeout <http://pypi.python.org/pypi/pytest-timeout>`_:
|
||||
to timeout tests based on function marks or global definitions.
|
||||
|
||||
* `pytest-cache <http://pypi.python.org/pypi/pytest-cache>`_:
|
||||
to interactively re-run failing tests and help other plugins to
|
||||
store test run information across invocations.
|
||||
|
||||
* `pytest-pep8 <http://pypi.python.org/pypi/pytest-pep8>`_:
|
||||
a ``--pep8`` option to enable PEP8 compliance checking.
|
||||
|
||||
* `pytest-flakes <https://pypi.python.org/pypi/pytest-flakes>`_:
|
||||
check source code with pyflakes.
|
||||
|
||||
* `oejskit <http://pypi.python.org/pypi/oejskit>`_:
|
||||
a plugin to run javascript unittests in life browsers
|
||||
a plugin to run javascript unittests in life browsers.
|
||||
|
||||
To see a complete list of all plugins with their latest testing
|
||||
status against different py.test and Python versions, please visit
|
||||
|
@ -108,8 +106,21 @@ You can prevent plugins from loading or unregister them::
|
|||
py.test -p no:NAME
|
||||
|
||||
This means that any subsequent try to activate/load the named
|
||||
plugin will it already existing. See :ref:`findpluginname` for
|
||||
how to obtain the name of a plugin.
|
||||
plugin will not work.
|
||||
|
||||
If you want to unconditionally disable a plugin for a project, you can add
|
||||
this option to your ``pytest.ini`` file:
|
||||
|
||||
.. code-block:: ini
|
||||
|
||||
[pytest]
|
||||
addopts = -p no:NAME
|
||||
|
||||
Alternatively to disable it only in certain environments (for example in a
|
||||
CI server), you can set ``PYTEST_ADDOPTS`` environment variable to
|
||||
``-p no:name``.
|
||||
|
||||
See :ref:`findpluginname` for how to obtain the name of a plugin.
|
||||
|
||||
.. _`builtin plugins`:
|
||||
|
||||
|
@ -123,6 +134,7 @@ in the `pytest repository <https://github.com/pytest-dev/pytest>`_.
|
|||
.. autosummary::
|
||||
|
||||
_pytest.assertion
|
||||
_pytest.cacheprovider
|
||||
_pytest.capture
|
||||
_pytest.config
|
||||
_pytest.doctest
|
||||
|
|
|
@ -1,290 +0,0 @@
.. _plugins_index:

List of Third-Party Plugins
===========================

The table below contains a listing of plugins found in PyPI and
their status when tested when using latest py.test and python versions.

A complete listing can also be found at
`plugincompat <http://plugincompat.herokuapp.com/>`_, which contains tests
status against other py.test releases.

[The rest of this removed file was a large generated table (Name / Py27 / Py34
/ Home / Summary) of plugin compatibility badges; its columns did not survive
extraction and are omitted here.]
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-marker-bugzilla-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-marker-bugzilla-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/eanxgeek/pytest_marker_bugzilla
|
||||
`pytest-remove-stale-bytecode <http://pypi.python.org/pypi/pytest-remove-stale-bytecode>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-remove-stale-bytecode-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-remove-stale-bytecode-latest?py=py34&pytest=2.8.0.dev4 .. image:: bitbucket.png py.test plugin to remove stale byte code files.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-remove-stale-bytecode-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-remove-stale-bytecode-latest?py=py34&pytest=2.8.0.dev4 :target: https://bitbucket.org/gocept/pytest-remove-stale-bytecode/
|
||||
`pytest-cache <http://pypi.python.org/pypi/pytest-cache>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-cache-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-cache-latest?py=py34&pytest=2.8.0.dev4 .. image:: bitbucket.png pytest plugin with mechanisms for caching across test runs
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-cache-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-cache-latest?py=py34&pytest=2.8.0.dev4 :target: http://bitbucket.org/hpk42/pytest-cache/
|
||||
`pytest-cagoule <http://pypi.python.org/pypi/pytest-cagoule>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-cagoule-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-cagoule-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Pytest plugin to only run tests affected by changes
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-cagoule-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-cagoule-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/davidszotten/pytest-cagoule
|
||||
`pytest-capturelog <http://pypi.python.org/pypi/pytest-capturelog>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-capturelog-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-capturelog-latest?py=py34&pytest=2.8.0.dev4 .. image:: bitbucket.png py.test plugin to capture log messages
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-capturelog-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-capturelog-latest?py=py34&pytest=2.8.0.dev4 :target: http://bitbucket.org/memedough/pytest-capturelog/overview
|
||||
`pytest-django-casperjs <http://pypi.python.org/pypi/pytest-django-casperjs>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-django-casperjs-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-django-casperjs-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Integrate CasperJS with your django tests as a pytest fixture.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-django-casperjs-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-django-casperjs-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/EnTeQuAk/pytest-django-casperjs/
|
||||
`pytest-catchlog <http://pypi.python.org/pypi/pytest-catchlog>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-catchlog-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-catchlog-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png py.test plugin to catch log messages. This is a fork of pytest-capturelog.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-catchlog-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-catchlog-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/eisensheng/pytest-catchlog
|
||||
`pytest-circleci <http://pypi.python.org/pypi/pytest-circleci>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-circleci-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-circleci-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png py.test plugin for CircleCI
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-circleci-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-circleci-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/micktwomey/pytest-circleci
|
||||
`pytest-cloud <http://pypi.python.org/pypi/pytest-cloud>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-cloud-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-cloud-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Distributed tests planner plugin for pytest testing framework.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-cloud-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-cloud-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/pytest-dev/pytest-cloud
|
||||
`pytest-codecheckers <http://pypi.python.org/pypi/pytest-codecheckers>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-codecheckers-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-codecheckers-latest?py=py34&pytest=2.8.0.dev4 .. image:: bitbucket.png pytest plugin to add source code sanity checks (pep8 and friends)
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-codecheckers-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-codecheckers-latest?py=py34&pytest=2.8.0.dev4 :target: http://bitbucket.org/RonnyPfannschmidt/pytest-codecheckers/
|
||||
`pytest-colordots <http://pypi.python.org/pypi/pytest-colordots>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-colordots-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-colordots-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Colorizes the progress indicators
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-colordots-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-colordots-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/svenstaro/pytest-colordots
|
||||
`pytest-paste-config <http://pypi.python.org/pypi/pytest-paste-config>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-paste-config-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-paste-config-latest?py=py34&pytest=2.8.0.dev4 ? Allow setting the path to a paste config file
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-paste-config-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-paste-config-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-config <http://pypi.python.org/pypi/pytest-config>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-config-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-config-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Base configurations and utilities for developing your Python project test suite with pytest.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-config-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-config-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/buzzfeed/pytest_config
|
||||
`pytest-contextfixture <http://pypi.python.org/pypi/pytest-contextfixture>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-contextfixture-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-contextfixture-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Define pytest fixtures as context managers.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-contextfixture-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-contextfixture-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/pelme/pytest-contextfixture/
|
||||
`pytest-couchdbkit <http://pypi.python.org/pypi/pytest-couchdbkit>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-couchdbkit-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-couchdbkit-latest?py=py34&pytest=2.8.0.dev4 .. image:: bitbucket.png py.test extension for per-test couchdb databases using couchdbkit
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-couchdbkit-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-couchdbkit-latest?py=py34&pytest=2.8.0.dev4 :target: http://bitbucket.org/RonnyPfannschmidt/pytest-couchdbkit
|
||||
`pytest-cov <http://pypi.python.org/pypi/pytest-cov>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-cov-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-cov-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png py.test plugin for coverage reporting with support for both centralised and distributed testing, including subprocesses and multiprocessing
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-cov-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-cov-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/schlamar/pytest-cov
|
||||
`pytest-cover <http://pypi.python.org/pypi/pytest-cover>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-cover-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-cover-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Pytest plugin for measuring coverage. Forked from `pytest-cov`.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-cover-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-cover-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/ionelmc/pytest-cover
|
||||
`pytest-coverage <http://pypi.python.org/pypi/pytest-coverage>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-coverage-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-coverage-latest?py=py34&pytest=2.8.0.dev4 `link <https://pypi.python.org/pypi/pytest-cover/>`_ Pytest plugin for measuring coverage. Forked from `pytest-cov`.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-coverage-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-coverage-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-cpp <http://pypi.python.org/pypi/pytest-cpp>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-cpp-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-cpp-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Use pytest's runner to discover and execute C++ tests
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-cpp-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-cpp-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/pytest-dev/pytest-cpp
|
||||
`pytest-curl-report <http://pypi.python.org/pypi/pytest-curl-report>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-curl-report-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-curl-report-latest?py=py34&pytest=2.8.0.dev4 .. image:: bitbucket.png pytest plugin to generate curl command line report
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-curl-report-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-curl-report-latest?py=py34&pytest=2.8.0.dev4 :target: https://bitbucket.org/pytest-dev/pytest-curl-report
|
||||
`pytest-dbfixtures <http://pypi.python.org/pypi/pytest-dbfixtures>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-dbfixtures-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-dbfixtures-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Databases fixtures plugin for py.test.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-dbfixtures-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-dbfixtures-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/ClearcodeHQ/pytest-dbfixtures
|
||||
`pytest-dbus-notification <http://pypi.python.org/pypi/pytest-dbus-notification>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-dbus-notification-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-dbus-notification-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png D-BUS notifications for pytest results.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-dbus-notification-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-dbus-notification-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/bmathieu33/pytest-dbus-notification
|
||||
`pytest-describe <http://pypi.python.org/pypi/pytest-describe>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-describe-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-describe-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Describe-style plugin for pytest
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-describe-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-describe-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/ropez/pytest-describe
|
||||
`pytest-diffeo <http://pypi.python.org/pypi/pytest-diffeo>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-diffeo-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-diffeo-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Common py.test support for Diffeo packages
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-diffeo-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-diffeo-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/diffeo/pytest-diffeo
|
||||
`pytest-django-sqlcounts <http://pypi.python.org/pypi/pytest-django-sqlcounts>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-django-sqlcounts-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-django-sqlcounts-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png py.test plugin for reporting the number of SQLs executed per django testcase.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-django-sqlcounts-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-django-sqlcounts-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/stj/pytest-django-sqlcount
|
||||
`pytest-django-sqlcount <http://pypi.python.org/pypi/pytest-django-sqlcount>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-django-sqlcount-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-django-sqlcount-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png py.test plugin for reporting the number of SQLs executed per django testcase.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-django-sqlcount-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-django-sqlcount-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/stj/pytest-django-sqlcount
|
||||
`pytest-django-haystack <http://pypi.python.org/pypi/pytest-django-haystack>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-django-haystack-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-django-haystack-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Cleanup your Haystack indexes between tests
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-django-haystack-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-django-haystack-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/rouge8/pytest-django-haystack
|
||||
`pytest-django-lite <http://pypi.python.org/pypi/pytest-django-lite>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-django-lite-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-django-lite-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png The bare minimum to integrate py.test with Django.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-django-lite-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-django-lite-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/dcramer/pytest-django-lite
|
||||
`pytest-django <http://pypi.python.org/pypi/pytest-django>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-django-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-django-latest?py=py34&pytest=2.8.0.dev4 `link <http://pytest-django.readthedocs.org/>`_ A Django plugin for py.test.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-django-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-django-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-doc <http://pypi.python.org/pypi/pytest-doc>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-doc-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-doc-latest?py=py34&pytest=2.8.0.dev4 `link <http://pytest-doc.readthedocs.org/>`_ A documentation plugin for py.test.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-doc-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-doc-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-dump2json <http://pypi.python.org/pypi/pytest-dump2json>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-dump2json-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-dump2json-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png A pytest plugin for dumping test results to json.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-dump2json-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-dump2json-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/d6e/pytest-dump2json
|
||||
`pytest-echo <http://pypi.python.org/pypi/pytest-echo>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-echo-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-echo-latest?py=py34&pytest=2.8.0.dev4 `link <http://pypi.python.org/pypi/pytest-echo/>`_ pytest plugin with mechanisms for echoing environment variables, package version and generic attributes
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-echo-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-echo-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-env <http://pypi.python.org/pypi/pytest-env>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-env-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-env-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png py.test plugin that allows you to add environment variables.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-env-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-env-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/MobileDynasty/pytest-env
|
||||
`pytest-eradicate <http://pypi.python.org/pypi/pytest-eradicate>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-eradicate-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-eradicate-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png pytest plugin to check for commented out code
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-eradicate-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-eradicate-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/spil-johan/pytest-eradicate
|
||||
`pytest-factoryboy <http://pypi.python.org/pypi/pytest-factoryboy>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-factoryboy-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-factoryboy-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Factory Boy support for pytest.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-factoryboy-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-factoryboy-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/pytest-dev/pytest-factoryboy
|
||||
`pytest-poo-fail <http://pypi.python.org/pypi/pytest-poo-fail>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-poo-fail-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-poo-fail-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Visualize your failed tests with poo
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-poo-fail-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-poo-fail-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/alyssa.barela/pytest-poo-fail
|
||||
`pytest-faker <http://pypi.python.org/pypi/pytest-faker>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-faker-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-faker-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Faker integration for pytest framework.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-faker-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-faker-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/pytest-dev/pytest-faker
|
||||
`pytest-faulthandler <http://pypi.python.org/pypi/pytest-faulthandler>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-faulthandler-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-faulthandler-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png py.test plugin that activates the fault handler module for tests
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-faulthandler-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-faulthandler-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/pytest-dev/pytest-faulthandler
|
||||
`pytest-fauxfactory <http://pypi.python.org/pypi/pytest-fauxfactory>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-fauxfactory-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-fauxfactory-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Integration of fauxfactory into pytest.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-fauxfactory-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-fauxfactory-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/mfalesni/pytest-fauxfactory
|
||||
`pytest-figleaf <http://pypi.python.org/pypi/pytest-figleaf>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-figleaf-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-figleaf-latest?py=py34&pytest=2.8.0.dev4 .. image:: bitbucket.png py.test figleaf coverage plugin
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-figleaf-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-figleaf-latest?py=py34&pytest=2.8.0.dev4 :target: http://bitbucket.org/hpk42/pytest-figleaf
|
||||
`pytest-fixture-tools <http://pypi.python.org/pypi/pytest-fixture-tools>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-fixture-tools-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-fixture-tools-latest?py=py34&pytest=2.8.0.dev4 ? Plugin for pytest which provides tools for fixtures
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-fixture-tools-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-fixture-tools-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-flake8 <http://pypi.python.org/pypi/pytest-flake8>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-flake8-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-flake8-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png pytest plugin to check FLAKE8 requirements
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-flake8-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-flake8-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/tholo/pytest-flake8
|
||||
`pytest-flakes <http://pypi.python.org/pypi/pytest-flakes>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-flakes-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-flakes-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png pytest plugin to check source code with pyflakes
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-flakes-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-flakes-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/fschulze/pytest-flakes
|
||||
`pytest-flask <http://pypi.python.org/pypi/pytest-flask>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-flask-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-flask-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png A set of py.test fixtures to test Flask applications.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-flask-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-flask-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/vitalk/pytest-flask
|
||||
`pytest-gitignore <http://pypi.python.org/pypi/pytest-gitignore>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-gitignore-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-gitignore-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png py.test plugin to ignore the same files as git
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-gitignore-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-gitignore-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/tgs/pytest-gitignore
|
||||
`pytest-greendots <http://pypi.python.org/pypi/pytest-greendots>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-greendots-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-greendots-latest?py=py34&pytest=2.8.0.dev4 ? Green progress dots
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-greendots-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-greendots-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-growl <http://pypi.python.org/pypi/pytest-growl>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-growl-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-growl-latest?py=py34&pytest=2.8.0.dev4 ? Growl notifications for pytest results.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-growl-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-growl-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-html <http://pypi.python.org/pypi/pytest-html>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-html-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-html-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png pytest plugin for generating HTML reports
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-html-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-html-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/davehunt/pytest-html
|
||||
`pytest-httpbin <http://pypi.python.org/pypi/pytest-httpbin>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-httpbin-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-httpbin-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Easily test your HTTP library against a local copy of httpbin
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-httpbin-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-httpbin-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/kevin1024/pytest-httpbin
|
||||
`pytest-httpretty <http://pypi.python.org/pypi/pytest-httpretty>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-httpretty-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-httpretty-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png A thin wrapper of HTTPretty for pytest
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-httpretty-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-httpretty-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/papaeye/pytest-httpretty
|
||||
`pytest-incremental <http://pypi.python.org/pypi/pytest-incremental>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-incremental-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-incremental-latest?py=py34&pytest=2.8.0.dev4 `link <http://pytest-incremental.readthedocs.org>`_ an incremental test runner (pytest plugin)
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-incremental-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-incremental-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-instafail <http://pypi.python.org/pypi/pytest-instafail>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-instafail-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-instafail-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png py.test plugin to show failures instantly
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-instafail-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-instafail-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/jpvanhal/pytest-instafail
|
||||
`pytest-ipdb <http://pypi.python.org/pypi/pytest-ipdb>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-ipdb-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-ipdb-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png A py.test plug-in to enable drop to ipdb debugger on test failure.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-ipdb-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-ipdb-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/mverteuil/pytest-ipdb
|
||||
`pytest-ipynb <http://pypi.python.org/pypi/pytest-ipynb>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-ipynb-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-ipynb-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Use pytest's runner to discover and execute tests as cells of IPython notebooks
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-ipynb-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-ipynb-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/zonca/pytest-ipynb
|
||||
`pytest-isort <http://pypi.python.org/pypi/pytest-isort>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-isort-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-isort-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png pytest plugin to perform isort checks (import ordering)
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-isort-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-isort-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/moccu/pytest-isort/
|
||||
`pytest-jira <http://pypi.python.org/pypi/pytest-jira>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-jira-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-jira-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png py.test JIRA integration plugin, using markers
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-jira-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-jira-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/jlaska/pytest_jira
|
||||
`pytest-knows <http://pypi.python.org/pypi/pytest-knows>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-knows-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-knows-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png A pytest plugin that can automaticly skip test case based on dependence info calculated by trace
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-knows-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-knows-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/mapix/ptknows
|
||||
`pytest-konira <http://pypi.python.org/pypi/pytest-konira>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-konira-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-konira-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Run Konira DSL tests with py.test
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-konira-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-konira-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/alfredodeza/pytest-konira
|
||||
`pytest-localserver <http://pypi.python.org/pypi/pytest-localserver>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-localserver-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-localserver-latest?py=py34&pytest=2.8.0.dev4 .. image:: bitbucket.png py.test plugin to test server connections locally.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-localserver-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-localserver-latest?py=py34&pytest=2.8.0.dev4 :target: http://bitbucket.org/basti/pytest-localserver/
|
||||
`pytest-markfiltration <http://pypi.python.org/pypi/pytest-markfiltration>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-markfiltration-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-markfiltration-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png UNKNOWN
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-markfiltration-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-markfiltration-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/adamgoucher/pytest-markfiltration
|
||||
`pytest-marks <http://pypi.python.org/pypi/pytest-marks>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-marks-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-marks-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png UNKNOWN
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-marks-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-marks-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/adamgoucher/pytest-marks
|
||||
`pytest-mccabe <http://pypi.python.org/pypi/pytest-mccabe>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-mccabe-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-mccabe-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png pytest plugin to run the mccabe code complexity checker.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-mccabe-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-mccabe-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/The-Compiler/pytest-mccabe
|
||||
`pytest-mock <http://pypi.python.org/pypi/pytest-mock>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-mock-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-mock-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Thin-wrapper around the mock package for easier use with py.test
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-mock-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-mock-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/pytest-dev/pytest-mock/
|
||||
`pytest-monkeyplus <http://pypi.python.org/pypi/pytest-monkeyplus>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-monkeyplus-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-monkeyplus-latest?py=py34&pytest=2.8.0.dev4 .. image:: bitbucket.png pytest's monkeypatch subclass with extra functionalities
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-monkeyplus-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-monkeyplus-latest?py=py34&pytest=2.8.0.dev4 :target: http://bitbucket.org/hsoft/pytest-monkeyplus/
|
||||
`pytest-mozwebqa <http://pypi.python.org/pypi/pytest-mozwebqa>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-mozwebqa-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-mozwebqa-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Mozilla WebQA plugin for py.test.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-mozwebqa-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-mozwebqa-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/mozilla/pytest-mozwebqa
|
||||
`pytest-mpl <http://pypi.python.org/pypi/pytest-mpl>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-mpl-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-mpl-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png pytest plugin to help with testing figures output from Matplotlib
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-mpl-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-mpl-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/astrofrog/pytest-mpl
|
||||
`pytest-multihost <http://pypi.python.org/pypi/pytest-multihost>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-multihost-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-multihost-latest?py=py34&pytest=2.8.0.dev4 `link <https://fedorahosted.org/python-pytest-multihost/>`_ Utility for writing multi-host tests for pytest
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-multihost-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-multihost-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-oerp <http://pypi.python.org/pypi/pytest-oerp>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-oerp-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-oerp-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png pytest plugin to test OpenERP modules
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-oerp-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-oerp-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/santagada/pytest-oerp/
|
||||
`pytest-oot <http://pypi.python.org/pypi/pytest-oot>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-oot-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-oot-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Run object-oriented tests in a simple format
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-oot-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-oot-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/steven004/pytest_oot
|
||||
`pytest-optional <http://pypi.python.org/pypi/pytest-optional>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-optional-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-optional-latest?py=py34&pytest=2.8.0.dev4 .. image:: bitbucket.png include/exclude values of fixtures in pytest
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-optional-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-optional-latest?py=py34&pytest=2.8.0.dev4 :target: http://bitbucket.org/maho/pytest-optional
|
||||
`pytest-ordering <http://pypi.python.org/pypi/pytest-ordering>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-ordering-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-ordering-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png pytest plugin to run your tests in a specific order
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-ordering-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-ordering-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/ftobia/pytest-ordering
|
||||
`pytest-osxnotify <http://pypi.python.org/pypi/pytest-osxnotify>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-osxnotify-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-osxnotify-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png OS X notifications for py.test results.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-osxnotify-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-osxnotify-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/dbader/pytest-osxnotify
|
||||
`pytest-pep257 <http://pypi.python.org/pypi/pytest-pep257>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-pep257-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-pep257-latest?py=py34&pytest=2.8.0.dev4 ? py.test plugin for pep257
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-pep257-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-pep257-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-pep8 <http://pypi.python.org/pypi/pytest-pep8>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-pep8-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-pep8-latest?py=py34&pytest=2.8.0.dev4 .. image:: bitbucket.png pytest plugin to check PEP8 requirements
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-pep8-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-pep8-latest?py=py34&pytest=2.8.0.dev4 :target: http://bitbucket.org/hpk42/pytest-pep8/
|
||||
`pytest-pipeline <http://pypi.python.org/pypi/pytest-pipeline>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-pipeline-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-pipeline-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Pytest plugin for functional testing of data analysis pipelines
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-pipeline-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-pipeline-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/bow/pytest-pipeline
|
||||
`pytest-poo <http://pypi.python.org/pypi/pytest-poo>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-poo-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-poo-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Visualize your crappy tests
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-poo-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-poo-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/pelme/pytest-poo
|
||||
`pytest-proper-wheel <http://pypi.python.org/pypi/pytest-proper-wheel>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-proper-wheel-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-proper-wheel-latest?py=py34&pytest=2.8.0.dev4 `link <http://pytest.org>`_ pytest: simple powerful testing with Python
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-proper-wheel-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-proper-wheel-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-purkinje <http://pypi.python.org/pypi/pytest-purkinje>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-purkinje-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-purkinje-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png py.test plugin for purkinje test runner
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-purkinje-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-purkinje-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/bbiskup
|
||||
`pytest-pycharm <http://pypi.python.org/pypi/pytest-pycharm>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-pycharm-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-pycharm-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png Plugin for py.test to enter PyCharm debugger on uncaught exceptions
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-pycharm-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-pycharm-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/jlubcke/pytest-pycharm
|
||||
`pytest-pydev <http://pypi.python.org/pypi/pytest-pydev>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-pydev-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-pydev-latest?py=py34&pytest=2.8.0.dev4 .. image:: bitbucket.png py.test plugin to connect to a remote debug server with PyDev or PyCharm.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-pydev-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-pydev-latest?py=py34&pytest=2.8.0.dev4 :target: http://bitbucket.org/basti/pytest-pydev/
|
||||
`pytest-pylint <http://pypi.python.org/pypi/pytest-pylint>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-pylint-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-pylint-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png pytest plugin to check source code with pylint
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-pylint-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-pylint-latest?py=py34&pytest=2.8.0.dev4 :target: https://github.com/carsongee/pytest-pylint
|
||||
`pytest-pyq <http://pypi.python.org/pypi/pytest-pyq>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-pyq-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-pyq-latest?py=py34&pytest=2.8.0.dev4 `link <http://pyq.enlnt.com>`_ Pytest fixture "q" for pyq
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-pyq-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-pyq-latest?py=py34&pytest=2.8.0.dev4
|
||||
`pytest-sftpserver <http://pypi.python.org/pypi/pytest-sftpserver>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-sftpserver-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-sftpserver-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png py.test plugin to locally test sftp server connections.
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-sftpserver-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-sftpserver-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/ulope/pytest-sftpserver/
|
||||
`pytest-rage <http://pypi.python.org/pypi/pytest-rage>`_ .. image:: http://plugincompat.herokuapp.com/status/pytest-rage-latest?py=py27&pytest=2.8.0.dev4 .. image:: http://plugincompat.herokuapp.com/status/pytest-rage-latest?py=py34&pytest=2.8.0.dev4 .. image:: github.png pytest plugin to implement PEP712
|
||||
:target: http://plugincompat.herokuapp.com/output/pytest-rage-latest?py=py27&pytest=2.8.0.dev4 :target: http://plugincompat.herokuapp.com/output/pytest-rage-latest?py=py34&pytest=2.8.0.dev4 :target: http://github.com/santagada/pytest-rage/
|
||||
Each entry below links the plugin's PyPI page, its repository, and gives the PyPI summary; in the generated table every entry also carries py27/py34 compatibility badges pointing to http://plugincompat.herokuapp.com/status/<name>-latest and http://plugincompat.herokuapp.com/output/<name>-latest (with ``?py=py27|py34&pytest=2.8.0.dev4``).

- `pytest-smartcov <http://pypi.python.org/pypi/pytest-smartcov>`_ (`repo <https://github.com/carljm/pytest-smartcov/>`_) -- Smart coverage plugin for pytest.
- `pytest-variables <http://pypi.python.org/pypi/pytest-variables>`_ (`repo <https://github.com/davehunt/pytest-variables>`_) -- pytest plugin for providing variables to tests/fixtures
- `pytest-selenium <http://pypi.python.org/pypi/pytest-selenium>`_ (`repo <https://github.com/codingjoe/pytest-selenium>`_) -- A selenium plugin for pytest
- `pytest-readme <http://pypi.python.org/pypi/pytest-readme>`_ (`repo <https://github.com/boxed/pytest-readme>`_) -- Test your README.md file
- `pytest-translations <http://pypi.python.org/pypi/pytest-translations>`_ (`repo <https://github.com/thermondo/pytest-translations>`_) -- Test your translation files
- `pytest-xprocess <http://pypi.python.org/pypi/pytest-xprocess>`_ (`repo <http://bitbucket.org/hpk42/pytest-xprocess/>`_) -- pytest plugin to manage external processes across test runs
- `pytest-random <http://pypi.python.org/pypi/pytest-random>`_ (`repo <https://github.com/klrmn/pytest-random>`_) -- py.test plugin to randomize tests
- `pytest-sourceorder <http://pypi.python.org/pypi/pytest-sourceorder>`_ (`link <https://fedorahosted.org/python-pytest-sourceorder/>`_) -- Test-ordering plugin for pytest
- `pytest-zap <http://pypi.python.org/pypi/pytest-zap>`_ (`repo <https://github.com/davehunt/pytest-zap>`_) -- OWASP ZAP plugin for py.test.
- `pytest-raisesregexp <http://pypi.python.org/pypi/pytest-raisesregexp>`_ (`repo <https://github.com/Walkman/pytest_raisesregexp>`_) -- Simple pytest plugin to look for regex in Exceptions
- `pytest-trialtemp <http://pypi.python.org/pypi/pytest-trialtemp>`_ (`repo <http://github.com/jerith/pytest-trialtemp>`_) -- py.test plugin for using the same _trial_temp working directory as trial
- `pytest-sftpserver <http://pypi.python.org/pypi/pytest-sftpserver>`_ (`repo <http://github.com/ulope/pytest-sftpserver/>`_) -- py.test plugin to locally test sftp server connections.
- `pytest-rerunfailures <http://pypi.python.org/pypi/pytest-rerunfailures>`_ (`repo <https://github.com/klrmn/pytest-rerunfailures>`_) -- py.test plugin to re-run tests to eliminate flakey failures
- `pytest-spec <http://pypi.python.org/pypi/pytest-spec>`_ (`repo <https://github.com/pchomik/pytest-spec>`_) -- pytest plugin to display test execution output like a SPECIFICATION
- `pytest-testmon <http://pypi.python.org/pypi/pytest-testmon>`_ (`repo <https://github.com/tarpas/pytest-testmon/>`_) -- take TDD to a new level with py.test and testmon
- `pytest-stepwise <http://pypi.python.org/pypi/pytest-stepwise>`_ (`repo <https://github.com/nip3o/pytest-stepwise>`_) -- Run a test suite one failing test at a time.
- `pytest-runfailed <http://pypi.python.org/pypi/pytest-runfailed>`_ (`repo <http://github.com/dmerejkowsky/pytest-runfailed>`_) -- implement a --failed option for pytest
- `pytest-tornado <http://pypi.python.org/pypi/pytest-tornado>`_ (`repo <https://github.com/eugeniy/pytest-tornado>`_) -- A py.test plugin providing fixtures and markers to simplify testing of asynchronous tornado applications.
- `pytest-timeout <http://pypi.python.org/pypi/pytest-timeout>`_ (`repo <http://bitbucket.org/flub/pytest-timeout/>`_) -- py.test plugin to abort hanging tests
- `pytest-ubersmith <http://pypi.python.org/pypi/pytest-ubersmith>`_ (`repo <https://github.com/hivelocity/pytest-ubersmith>`_) -- Easily mock calls to ubersmith at the `requests` level.
- `pytest-services <http://pypi.python.org/pypi/pytest-services>`_ (`repo <https://github.com/pytest-dev/pytest-services>`_) -- Services plugin for pytest testing framework
- `pytest-pythonpath <http://pypi.python.org/pypi/pytest-pythonpath>`_ (`repo <https://github.com/bigsassy/pytest-pythonpath>`_) -- pytest plugin for adding to the PYTHONPATH from command line or configs.
- `pytest-yamlwsgi <http://pypi.python.org/pypi/pytest-yamlwsgi>`_ (repository unknown) -- Run tests against wsgi apps defined in yaml
- `pytest-trello <http://pypi.python.org/pypi/pytest-trello>`_ (`repo <http://github.com/jlaska/pytest-trello>`_) -- Plugin for py.test that integrates trello using markers
- `pytest-quickcheck <http://pypi.python.org/pypi/pytest-quickcheck>`_ (`repo <https://bitbucket.org/pytest-dev/pytest-quickcheck>`_) -- pytest plugin to generate random data inspired by QuickCheck
- `pytest-twisted <http://pypi.python.org/pypi/pytest-twisted>`_ (`repo <https://github.com/schmir/pytest-twisted>`_) -- A twisted plugin for py.test.
- `pytest-watch <http://pypi.python.org/pypi/pytest-watch>`_ (`repo <http://github.com/joeyespo/pytest-watch>`_) -- Local continuous test runner with pytest and watchdog.
- `pytest-unmarked <http://pypi.python.org/pypi/pytest-unmarked>`_ (`repo <http://github.com/alyssa.barela/pytest-unmarked>`_) -- Run only unmarked tests
- `pytest-regtest <http://pypi.python.org/pypi/pytest-regtest>`_ (`link <https://sissource.ethz.ch/uweschmitt/pytest-regtest/tree/master>`_) -- py.test plugin for regression tests
- `pytest-xdist <http://pypi.python.org/pypi/pytest-xdist>`_ (`repo <http://bitbucket.org/hpk42/pytest-xdist>`_) -- py.test xdist plugin for distributed testing and loop-on-failing modes
- `pytest-sugar <http://pypi.python.org/pypi/pytest-sugar>`_ (`repo <https://github.com/Frozenball/pytest-sugar>`_) -- a plugin for py.test that changes the default look and feel of py.test (e.g. progressbar, show tests that fail instantly).
- `pytest-qt <http://pypi.python.org/pypi/pytest-qt>`_ (`repo <http://github.com/pytest-dev/pytest-qt>`_) -- pytest support for PyQt and PySide applications
- `pytest-runner <http://pypi.python.org/pypi/pytest-runner>`_ (`repo <https://bitbucket.org/pytest-dev/pytest-runner>`_) -- Invoke py.test as distutils command with dependency resolution.
- `pytest-splinter <http://pypi.python.org/pypi/pytest-splinter>`_ (`repo <https://github.com/pytest-dev/pytest-splinter>`_) -- Splinter plugin for pytest testing framework

*(Updated on 2015-06-30)*
@ -1,307 +0,0 @@
"""
Script to generate the file `index.txt` with information about
pytest plugins taken directly from PyPI.

Usage:
    python plugins_index.py

This command will update `index.txt` in the same directory found as this script.
This should be issued before every major documentation release to obtain latest
versions from PyPI.

Also includes plugin compatibility between different python and pytest versions,
obtained from http://plugincompat.herokuapp.com.
"""
from __future__ import print_function
from collections import namedtuple
import datetime
from distutils.version import LooseVersion
import itertools
from optparse import OptionParser
import os
import sys

import pytest


def get_proxy(url):
    """
    wrapper function to obtain a xmlrpc proxy, taking in account import
    differences between python 2.X and 3.X

    :param url: url to bind the proxy to
    :return: a ServerProxy instance
    """
    if sys.version_info < (3, 0):
        from xmlrpclib import ServerProxy
    else:
        from xmlrpc.client import ServerProxy
    return ServerProxy(url)


def iter_plugins(client):
    """
    Returns an iterator of (name, version) from PyPI.

    :param client: ServerProxy
    :param search: package names to search for
    """
    for plug_data in client.search({'name': 'pytest'}):
        if plug_data['name'].startswith('pytest-'):
            yield plug_data['name'], plug_data['version']


def get_latest_versions(plugins):
    """
    Returns an iterator of (name, version) from the given list of (name,
    version), but returning only the latest version of the package. Uses
    distutils.LooseVersion to ensure compatibility with PEP386.
    """
    plugins = [(name, LooseVersion(version)) for (name, version) in plugins]
    for name, grouped_plugins in itertools.groupby(plugins, key=lambda x: x[0]):
        name, loose_version = list(grouped_plugins)[-1]
        yield name, str(loose_version)


def obtain_plugins_table(plugins, client, verbose, pytest_ver):
    """
    Returns information to populate a table of plugins, their versions,
    authors, etc.

    The returned information is a list of columns of `ColumnData`
    namedtuples(text, link). Link can be None if the text for that column
    should not be linked to anything.

    :param plugins: list of (name, version)
    :param client: ServerProxy
    :param verbose: print plugin name and version as they are fetch
    :param pytest_ver: pytest version to use.
    """
    if pytest_ver is None:
        pytest_ver = pytest.__version__

    def get_repo_markup(repo):
        """
        obtains appropriate markup for the given repository, as two lines
        that should be output in the same table row. We use this to display an icon
        for known repository hosts (github, etc), just a "?" char when
        repository is not registered in pypi or a simple link otherwise.
        """
        target = repo
        if 'github.com' in repo:
            image = 'github.png'
        elif 'bitbucket.org' in repo:
            image = 'bitbucket.png'
        elif repo.lower() == 'unknown':
            return '?', ''
        else:
            image = None

        if image is not None:
            image_markup = '.. image:: %s' % image
            target_markup = ' :target: %s' % repo
            pad_right = ('%-' + str(len(target_markup)) + 's')
            return pad_right % image_markup, target_markup
        else:
            return ('`link <%s>`_' % target), ''

    def sanitize_summary(summary):
        """Make sure summaries don't break our table formatting.
        """
        return summary.replace('\n', ' ')

    rows = []
    ColumnData = namedtuple('ColumnData', 'text link')
    headers = ['Name', 'Py27', 'Py34', 'Home', 'Summary']
    repositories = obtain_override_repositories()
    print('Generating plugins_index page (pytest-{0})'.format(pytest_ver))
    plugins = list(plugins)
    for index, (package_name, version) in enumerate(plugins):
        if verbose:
            print(package_name, version, '...', end='')

        release_data = client.release_data(package_name, version)

        common_params = dict(
            site='http://plugincompat.herokuapp.com',
            name=package_name,
            version=version)

        repository = repositories.get(package_name, release_data['home_page'])
        repo_markup_1, repo_markup_2 = get_repo_markup(repository)

        # first row: name, images and simple links
        url = '.. image:: {site}/status/{name}-latest'
        image_url = url.format(**common_params)
        image_url += '?py={py}&pytest={pytest}'
        row = (
            ColumnData(package_name, release_data['package_url']),
            ColumnData(image_url.format(py='py27', pytest=pytest_ver),
                       None),
            ColumnData(image_url.format(py='py34', pytest=pytest_ver),
                       None),
            ColumnData(
                repo_markup_1,
                None),
            ColumnData(sanitize_summary(release_data['summary']), None),
        )
        assert len(row) == len(headers)
        rows.append(row)

        # second row: links for images (they should be in their own line)
        url = '    :target: {site}/output/{name}-latest'
        output_url = url.format(**common_params)
        output_url += '?py={py}&pytest={pytest}'

        row = (
            ColumnData('', None),
            ColumnData(output_url.format(py='py27', pytest=pytest_ver),
                       None),
            ColumnData(output_url.format(py='py34', pytest=pytest_ver),
                       None),
            ColumnData(repo_markup_2, None),
            ColumnData('', None),
        )
        assert len(row) == len(headers)
        rows.append(row)

        if verbose:
            print('OK (%d%%)' % ((index + 1) * 100 / len(plugins)))

    print('Done: %d plugins' % len(plugins))

    return headers, rows


def obtain_override_repositories():
    """
    Used to override the "home_page" obtained from pypi to known
    package repositories. Used when the author didn't fill the "home_page"
    field in setup.py.

    :return: dict of {package_name: repository_url}
    """
    return {
        'pytest-blockage': 'https://github.com/rob-b/pytest-blockage',
        'pytest-konira': 'http://github.com/alfredodeza/pytest-konira',
        'pytest-sugar': 'https://github.com/Frozenball/pytest-sugar',
    }


def generate_plugins_index_from_table(filename, headers, rows, pytest_ver):
    """
    Generates a RST file with the table data given.

    :param filename: output filename
    :param headers: see `obtain_plugins_table`
    :param rows: see `obtain_plugins_table`
    :param pytest_ver: see `obtain_plugins_table`
    """
    # creates a list of rows, each being a str containing appropriate column
    # text and link
    table_texts = []
    for row in rows:
        column_texts = []
        for i, col_data in enumerate(row):
            text = '`%s <%s>`_' % (
                col_data.text,
                col_data.link) if col_data.link else col_data.text
            column_texts.append(text)
        table_texts.append(column_texts)

    # compute max length of each column so we can build the rst table
    column_lengths = [len(x) for x in headers]
    for column_texts in table_texts:
        for i, row_text in enumerate(column_texts):
            column_lengths[i] = max(column_lengths[i], len(row_text) + 2)

    def get_row_limiter(char):
        return ' '.join(char * length for length in column_lengths)

    with open(filename, 'w') as f:
        # header
        print(HEADER, file=f)
        print(file=f)

        # table
        print(get_row_limiter('='), file=f)
        formatted_headers = [
            '{0:^{fill}}'.format(header, fill=column_lengths[i])
            for i, header in enumerate(headers)]
        print(*formatted_headers, file=f)
        print(get_row_limiter('='), file=f)

        for column_texts in table_texts:
            formatted_rows = [
                '{0:^{fill}}'.format(row_text, fill=column_lengths[i])
                for i, row_text in enumerate(column_texts)
            ]
            print(*formatted_rows, file=f)
        print(file=f)
        print(get_row_limiter('='), file=f)
        print(file=f)
        today = datetime.date.today().strftime('%Y-%m-%d')
        print('*(Updated on %s)*' % today, file=f)


def generate_plugins_index(client, filename, verbose, pytest_ver):
    """
    Generates an RST file with a table of the latest pytest plugins found in
    PyPI.

    :param client: ServerProxy
    :param filename: output filename
    :param verbose: print name and version of each plugin as they are fetch
    :param pytest_ver: pytest version to use; if not given, use current pytest
        version.
    """
    plugins = get_latest_versions(iter_plugins(client))
    headers, rows = obtain_plugins_table(plugins, client, verbose, pytest_ver)
    generate_plugins_index_from_table(filename, headers, rows, pytest_ver)


def main(argv):
    """
    Script entry point. Configures an option parser and calls the appropriate
    internal function.
    """
    filename = os.path.join(os.path.dirname(__file__), 'index.txt')
    url = 'http://pypi.python.org/pypi'

    parser = OptionParser(
        description='Generates a restructured document of pytest plugins from PyPI')
    parser.add_option('-f', '--filename', default=filename,
                      help='output filename [default: %default]')
    parser.add_option('-u', '--url', default=url,
                      help='url of PyPI server to obtain data from [default: %default]')
    parser.add_option('-v', '--verbose', default=False, action='store_true',
                      help='verbose output')
    parser.add_option('--pytest-ver', default=None, action='store',
                      help='generate index for this pytest version (default current version)')
    (options, _) = parser.parse_args(argv[1:])

    client = get_proxy(options.url)
    generate_plugins_index(client, options.filename, options.verbose, options.pytest_ver)

    print()
    print('%s updated.' % options.filename)
    return 0


# header for the plugins_index page
HEADER = '''.. _plugins_index:

List of Third-Party Plugins
===========================

The table below contains a listing of plugins found in PyPI and
their status when tested when using latest py.test and python versions.

A complete listing can also be found at
`plugincompat <http://plugincompat.herokuapp.com/>`_, which contains tests
status against other py.test releases.
'''


if __name__ == '__main__':
    sys.exit(main(sys.argv))
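For context, a tiny hedged sketch of what ``get_latest_versions`` (defined in the removed script above) does with its input; the sample data is made up:

```python
# Hypothetical sample input: (name, version) pairs as iter_plugins() would yield them.
sample = [('pytest-foo', '1.0'), ('pytest-foo', '1.2'), ('pytest-bar', '0.3')]
# Keeps only the last entry per consecutive name group, i.e. the latest version.
print(list(get_latest_versions(sample)))  # [('pytest-foo', '1.2'), ('pytest-bar', '0.3')]
```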
@ -41,6 +41,10 @@ additional information::
Alternatively, you can examine raised warnings in detail using the
:ref:`recwarn <recwarn>` fixture (see below).

.. note::
    ``DeprecationWarning`` and ``PendingDeprecationWarning`` are treated
    differently; see :ref:`ensuring_function_triggers`.

.. _recwarn:

Recording warnings

@ -87,6 +91,9 @@ Each recorded warning has the attributes ``message``, ``category``,
class of the warning. The ``message`` is the warning itself; calling
``str(message)`` will return the actual message of the warning.

.. note::
    ``DeprecationWarning`` and ``PendingDeprecationWarning`` are treated
    differently; see :ref:`ensuring_function_triggers`.

.. _ensuring_function_triggers:

@ -94,16 +101,17 @@ Ensuring a function triggers a deprecation warning
-------------------------------------------------------

You can also call a global helper for checking
that a certain function call triggers a ``DeprecationWarning``::
that a certain function call triggers a ``DeprecationWarning`` or
``PendingDeprecationWarning``::

    import pytest

    def test_global():
        pytest.deprecated_call(myfunction, 17)

By default, deprecation warnings will not be caught when using ``pytest.warns``
or ``recwarn``, since the default Python warnings filters hide
DeprecationWarnings. If you wish to record them in your own code, use the
By default, ``DeprecationWarning`` and ``PendingDeprecationWarning`` will not be
caught when using ``pytest.warns`` or ``recwarn`` because default Python warnings filters hide
them. If you wish to record them in your own code, use the
command ``warnings.simplefilter('always')``::

    import warnings
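As a side note, a minimal self-contained sketch of the behaviour this doc change describes; the ``legacy`` function is hypothetical:

```python
import warnings
import pytest

def legacy():  # hypothetical deprecated function
    warnings.warn("use new_api() instead", DeprecationWarning)

def test_deprecated_call():
    # works for DeprecationWarning and PendingDeprecationWarning
    pytest.deprecated_call(legacy)

def test_recwarn_needs_always_filter(recwarn):
    # DeprecationWarning is hidden by the default filters, so enable it
    # explicitly before recording it with the recwarn fixture.
    warnings.simplefilter('always')
    legacy()
    assert recwarn.pop(DeprecationWarning)
```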
@ -93,8 +93,8 @@ As with all function :ref:`marking <mark>` you can skip test functions at the
`whole class- or module level`_. If your code targets python2.6 or above you
use the skipif decorator (and any other marker) on classes::

    @pytest.mark.skipif(sys.platform != 'win32',
                        reason="requires windows")
    @pytest.mark.skipif(sys.platform == 'win32',
                        reason="does not run on windows")
    class TestPosixCalls:

        def test_function(self):

@ -107,8 +107,8 @@ If your code targets python2.5 where class-decorators are not available,
you can set the ``pytestmark`` attribute of a class::

    class TestPosixCalls:
        pytestmark = pytest.mark.skipif(sys.platform != 'win32',
                                        reason="requires Windows")
        pytestmark = pytest.mark.skipif(sys.platform == 'win32',
                                        reason="does not run on windows")

        def test_function(self):
            "will not be setup or run under 'win32' platform"
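For reference, a runnable sketch of the class-level pattern the two hunks above converge on; module and test bodies are illustrative:

```python
import sys
import pytest

@pytest.mark.skipif(sys.platform == 'win32',
                    reason="does not run on windows")
class TestPosixCalls:
    def test_function(self):
        assert True  # not collected/run on win32

# python2.5-compatible alternative: set the mark as a class attribute
class TestPosixCallsCompat:
    pytestmark = pytest.mark.skipif(sys.platform == 'win32',
                                    reason="does not run on windows")

    def test_function(self):
        assert True
```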
@ -175,7 +175,7 @@ Running it with the report-on-xfail option gives this output::

    example $ py.test -rx xfail_demo.py
    ======= test session starts ========
    platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
    platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
    rootdir: $REGENDOC_TMPDIR/example, inifile:
    collected 7 items

@ -14,6 +14,9 @@ Talks and blog postings
.. _`tutorial1 repository`: http://bitbucket.org/pytest-dev/pytest-tutorial1/
.. _`pycon 2010 tutorial PDF`: http://bitbucket.org/pytest-dev/pytest-tutorial1/raw/tip/pytest-basic.pdf

- `Improve your testing with Pytest and Mock, Gabe Hollombe, PyCon SG 2015
  <https://www.youtube.com/watch?v=RcN26hznmk4>`_.

- `Introduction to pytest, Andreas Pelme, EuroPython 2014
  <https://www.youtube.com/watch?v=LdVJj65ikRY>`_.

@ -29,7 +29,7 @@ Running this would result in a passed test except for the last

    $ py.test test_tmpdir.py
    ======= test session starts ========
    platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
    platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
    rootdir: $REGENDOC_TMPDIR, inifile:
    collected 1 items

@ -88,7 +88,7 @@ the ``self.db`` values in the traceback::

    $ py.test test_unittest_db.py
    ======= test session starts ========
    platform linux -- Python 3.4.3, pytest-2.8.1, py-1.4.30, pluggy-0.3.1
    platform linux -- Python 3.4.3, pytest-2.8.4, py-1.4.30, pluggy-0.3.1
    rootdir: $REGENDOC_TMPDIR, inifile:
    collected 2 items

@ -126,7 +126,7 @@ when writing the class-scoped fixture function above.
autouse fixtures and accessing other fixtures
-------------------------------------------------------------------

Although it's usually better to explicitely declare use of fixtures you need
Although it's usually better to explicitly declare use of fixtures you need
for a given test, you may sometimes want to have fixtures that are
automatically used in a given context. After all, the traditional
style of unittest-setup mandates the use of this implicit fixture writing
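To make the idea concrete, a small hedged sketch of an autouse fixture that applies to every test in its module; the environment variable name is made up:

```python
import os
import pytest

@pytest.fixture(autouse=True)
def isolated_registry(tmpdir, monkeypatch):
    # runs before every test in this module without being requested explicitly
    monkeypatch.setenv("APP_REGISTRY", str(tmpdir.join("registry")))

def test_uses_isolated_registry():
    assert "registry" in os.environ["APP_REGISTRY"]
```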
@ -180,3 +180,11 @@ was executed ahead of the ``test_method``.
to selectively leave away the ``unittest.TestCase`` subclassing, use
plain asserts and get the unlimited pytest feature set.


Converting from unittest to pytest
---------------------------------------

If you want to convert your unittest testcases to pytest, there are
some helpers like `unittest2pytest
<https://pypi.python.org/pypi/unittest2pytest/>`__, which uses lib2to3
and introspection for the transformation.
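As an illustration of the kind of rewrite unittest2pytest automates, a minimal before/after sketch; class and function names are made up:

```python
import unittest

class TestMath(unittest.TestCase):   # before: unittest style
    def test_add(self):
        self.assertEqual(1 + 1, 2)

def test_add():                      # after: plain pytest function
    assert 1 + 1 == 2
```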
@ -46,6 +46,7 @@ Several test run options::
    py.test test_mod.py   # run tests in module
    py.test somepath      # run all tests below somepath
    py.test -k stringexpr # only run tests with names that match the
                          # the "string expression", e.g. "MyClass and not method"
                          # "string expression", e.g. "MyClass and not method"
                          # will select TestMyClass.test_something
                          # but not TestMyClass.test_method_simple
    py.test test_mod.py::test_func  # only run tests that match the "node ID",

@ -123,7 +123,7 @@ automatically disables its output capture when you enter PDB_ tracing:
  such.
* Any later output produced within the same test will not be captured and will
  instead get sent directly to ``sys.stdout``. Note that this holds true even
  for test output occuring after you exit the interactive PDB_ tracing session
  for test output occurring after you exit the interactive PDB_ tracing session
  and continue with the regular test run.

.. versionadded: 2.4.0
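For reference, a small module showing which names a run like ``py.test -k "MyClass and not method"`` would select; the test names are hypothetical:

```python
class TestMyClass:
    def test_something(self):      # selected: matches "MyClass", not "method"
        assert True

    def test_method_simple(self):  # deselected: name contains "method"
        assert True

def test_other():                  # deselected: does not match "MyClass"
    assert True
```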
@ -16,7 +16,7 @@ reporting by calling `well specified hooks`_ of the following plugins:

* :ref:`builtin plugins`: loaded from pytest's internal ``_pytest`` directory.

* :ref:`external plugins <plugins_index>`: modules discovered through
* :ref:`external plugins <extplugins>`: modules discovered through
  `setuptools entry points`_

* `conftest.py plugins`_: modules auto-discovered in test directories

@ -110,7 +110,7 @@ you can copy from:

* a custom collection example plugin: :ref:`yaml plugin`
* around 20 doc:`builtin plugins` which provide pytest's own functionality
* many :ref:`external plugins <plugins_index>` providing additional features
* many `external plugins <http://plugincompat.herokuapp.com>`_ providing additional features

All of these plugins implement the documented `well specified hooks`_
to extend and add functionality.

@ -203,13 +203,13 @@ pytest comes with some facilities that you can enable for testing your
plugin. Given that you have an installed plugin you can enable the
:py:class:`testdir <_pytest.pytester.Testdir>` fixture via specifying a
command line option to include the pytester plugin (``-p pytester``) or
by putting ``pytest_plugins = pytester`` into your test or
``conftest.py`` file. You then will have a ``testdir`` fixure which you
by putting ``pytest_plugins = "pytester"`` into your test or
``conftest.py`` file. You then will have a ``testdir`` fixture which you
can use like this::

    # content of test_myplugin.py

    pytest_plugins = pytester  # to get testdir fixture
    pytest_plugins = "pytester"  # to get testdir fixture

    def test_myplugin(testdir):
        testdir.makepyfile("""

@ -332,17 +332,17 @@ after others, i.e. the position in the ``N``-sized list of functions:

.. code-block:: python

    # Plugin 1
    @pytest.hookimpl_spec(tryfirst=True)
    @pytest.hookimpl(tryfirst=True)
    def pytest_collection_modifyitems(items):
        # will execute as early as possible

    # Plugin 2
    @pytest.hookimpl_spec(trylast=True)
    @pytest.hookimpl(trylast=True)
    def pytest_collection_modifyitems(items):
        # will execute as late as possible

    # Plugin 3
    @pytest.hookimpl_spec(hookwrapper=True)
    @pytest.hookimpl(hookwrapper=True)
    def pytest_collection_modifyitems(items):
        # will execute even before the tryfirst one above!
        outcome = yield

@ -386,7 +386,7 @@ are expected.

For an example, see `newhooks.py`_ from :ref:`xdist`.

.. _`newhooks.py`: https://bitbucket.org/pytest-dev/pytest-xdist/src/52082f70e7dd04b00361091b8af906c60fd6700f/xdist/newhooks.py?at=default
.. _`newhooks.py`: https://github.com/pytest-dev/pytest-xdist/blob/974bd566c599dc6a9ea291838c6f226197208b46/xdist/newhooks.py


Optionally using hooks from 3rd party plugins
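A minimal self-contained sketch of the pytester pattern described in the hunk above; the plugin behaviour being asserted is hypothetical:

```python
# content of test_myplugin.py
pytest_plugins = "pytester"  # provides the testdir fixture

def test_plugin_collects_and_passes(testdir):
    # write a throwaway test file and run pytest on it in a temp dir
    testdir.makepyfile("""
        def test_ok():
            assert 1
    """)
    result = testdir.runpytest()
    result.stdout.fnmatch_lines(["*1 passed*"])
```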
|
|
|
@ -95,6 +95,16 @@ class TestClass:
            "*1 passed*",
        ])

    def test_issue1035_obj_has_getattr(self, testdir):
        modcol = testdir.getmodulecol("""
            class Chameleon(object):
                def __getattr__(self, name):
                    return True
            chameleon = Chameleon()
        """)
        colitems = modcol.collect()
        assert len(colitems) == 0


class TestGenerator:
    def test_generative_functions(self, testdir):

@ -733,6 +743,78 @@ class TestTracebackCutting:
            "E*NameError*",
        ])

    def test_traceback_filter_error_during_fixture_collection(self, testdir):
        """integration test for issue #995.
        """
        testdir.makepyfile("""
            import pytest

            def fail_me(func):
                ns = {}
                exec('def w(): raise ValueError("fail me")', ns)
                return ns['w']

            @pytest.fixture(scope='class')
            @fail_me
            def fail_fixture():
                pass

            def test_failing_fixture(fail_fixture):
                pass
        """)
        result = testdir.runpytest()
        assert result.ret != 0
        out = result.stdout.str()
        assert "INTERNALERROR>" not in out
        result.stdout.fnmatch_lines([
            "*ValueError: fail me*",
            "* 1 error in *",
        ])

    def test_filter_traceback_generated_code(self):
        """test that filter_traceback() works with the fact that
        py.code.Code.path attribute might return an str object.
        In this case, one of the entries on the traceback was produced by
        dynamically generated code.
        See: https://bitbucket.org/pytest-dev/py/issues/71
        This fixes #995.
        """
        from _pytest.python import filter_traceback
        try:
            ns = {}
            exec('def foo(): raise ValueError', ns)
            ns['foo']()
        except ValueError:
            _, _, tb = sys.exc_info()

        tb = py.code.Traceback(tb)
        assert isinstance(tb[-1].path, str)
        assert not filter_traceback(tb[-1])

    def test_filter_traceback_path_no_longer_valid(self, testdir):
        """test that filter_traceback() works with the fact that
        py.code.Code.path attribute might return an str object.
        In this case, one of the files in the traceback no longer exists.
        This fixes #1133.
        """
        from _pytest.python import filter_traceback
        testdir.syspathinsert()
        testdir.makepyfile(filter_traceback_entry_as_str='''
            def foo():
                raise ValueError
        ''')
        try:
            import filter_traceback_entry_as_str
            filter_traceback_entry_as_str.foo()
        except ValueError:
            _, _, tb = sys.exc_info()

        testdir.tmpdir.join('filter_traceback_entry_as_str.py').remove()
        tb = py.code.Traceback(tb)
        assert isinstance(tb[-1].path, str)
        assert filter_traceback(tb[-1])


class TestReportInfo:
    def test_itemreport_reportinfo(self, testdir, linecomp):
        testdir.makeconftest("""

@ -798,6 +880,21 @@ class TestReportInfo:
            pass
        """

    def test_reportinfo_with_nasty_getattr(self, testdir):
        # https://github.com/pytest-dev/pytest/issues/1204
        modcol = testdir.getmodulecol("""
            # lineno 0
            class TestClass:
                def __getattr__(self, name):
                    return "this is not an int"

                def test_foo(self):
                    pass
        """)
        classcol = testdir.collect_by_name(modcol, "TestClass")
        instance = classcol.collect()[0]
        fspath, lineno, msg = instance.reportinfo()


def test_customized_python_discovery(testdir):
    testdir.makeini("""

@ -955,3 +1052,27 @@ def test_collect_functools_partial(testdir):
    """)
    result = testdir.inline_run()
    result.assertoutcome(passed=6, failed=2)


def test_dont_collect_non_function_callable(testdir):
    """Test for issue https://github.com/pytest-dev/pytest/issues/331

    In this case an INTERNALERROR occurred trying to report the failure of
    a test like this one because py test failed to get the source lines.
    """
    testdir.makepyfile("""
        class Oh(object):
            def __call__(self):
                pass

        test_a = Oh()

        def test_real():
            pass
    """)
    result = testdir.runpytest('-rw')
    result.stdout.fnmatch_lines([
        '*collected 1 item*',
        'WC2 *',
        '*1 passed, 1 pytest-warnings in *',
    ])

@ -283,6 +283,35 @@ class TestNoselikeTestAttribute:
        assert len(call.items) == 1
        assert call.items[0].cls.__name__ == "TC"

    def test_class_with_nasty_getattr(self, testdir):
        """Make sure we handle classes with a custom nasty __getattr__ right.

        With a custom __getattr__ which e.g. returns a function (like with a
        RPC wrapper), we shouldn't assume this meant "__test__ = True".
        """
        # https://github.com/pytest-dev/pytest/issues/1204
        testdir.makepyfile("""
            class MetaModel(type):

                def __getattr__(cls, key):
                    return lambda: None


            BaseModel = MetaModel('Model', (), {})


            class Model(BaseModel):

                __metaclass__ = MetaModel

                def test_blah(self):
                    pass
        """)
        reprec = testdir.inline_run()
        assert not reprec.getfailedcollections()
        call = reprec.getcalls("pytest_collection_modifyitems")[0]
        assert not call.items


@pytest.mark.issue351
class TestParameterize:
@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
import re

import pytest, py

@ -118,6 +119,41 @@ class TestMetafunc:
        assert metafunc._calls[2].id == "x1-a"
        assert metafunc._calls[3].id == "x1-b"

    @pytest.mark.skipif('sys.version_info[0] >= 3')
    def test_unicode_idval_python2(self):
        """unittest for the expected behavior to obtain ids for parametrized
        unicode values in Python 2: if convertible to ascii, they should appear
        as ascii values, otherwise fallback to hide the value behind the name
        of the parametrized variable name. #1086
        """
        from _pytest.python import _idval
        values = [
            (u'', ''),
            (u'ascii', 'ascii'),
            (u'ação', 'a6'),
            (u'josé@blah.com', 'a6'),
            (u'δοκ.ιμή@παράδειγμα.δοκιμή', 'a6'),
        ]
        for val, expected in values:
            assert _idval(val, 'a', 6, None) == expected

    def test_bytes_idval(self):
        """unittest for the expected behavior to obtain ids for parametrized
        bytes values:
        - python2: non-ascii strings are considered bytes and formatted using
          "binary escape", where any byte < 127 is escaped into its hex form.
        - python3: bytes objects are always escaped using "binary escape".
        """
        from _pytest.python import _idval
        values = [
            (b'', ''),
            (b'\xc3\xb4\xff\xe4', '\\xc3\\xb4\\xff\\xe4'),
            (b'ascii', 'ascii'),
            (u'αρά'.encode('utf-8'), '\\xce\\xb1\\xcf\\x81\\xce\\xac'),
        ]
        for val, expected in values:
            assert _idval(val, 'a', 6, None) == expected

    @pytest.mark.issue250
    def test_idmaker_autoname(self):
        from _pytest.python import idmaker
@ -371,38 +371,6 @@ class TestDoctests:
                                   "--junit-xml=junit.xml")
        reprec.assertoutcome(failed=1)

    def test_doctest_module_session_fixture(self, testdir):
        """Test that session fixtures are initialized for doctest modules (#768)
        """
        # session fixture which changes some global data, which will
        # be accessed by doctests in a module
        testdir.makeconftest("""
            import pytest
            import sys

            @pytest.yield_fixture(autouse=True, scope='session')
            def myfixture():
                assert not hasattr(sys, 'pytest_session_data')
                sys.pytest_session_data = 1
                yield
                del sys.pytest_session_data
        """)
        testdir.makepyfile(foo="""
            import sys

            def foo():
                '''
                >>> assert sys.pytest_session_data == 1
                '''

            def bar():
                '''
                >>> assert sys.pytest_session_data == 1
                '''
        """)
        result = testdir.runpytest("--doctest-modules")
        result.stdout.fnmatch_lines('*2 passed*')

    @pytest.mark.parametrize('config_mode', ['ini', 'comment'])
    def test_allow_unicode(self, testdir, config_mode):
        """Test that doctests which output unicode work in all python versions

@ -446,7 +414,7 @@ class TestDoctests:
        reprec.assertoutcome(passed=passed, failed=int(not passed))


class TestDocTestSkips:
class TestDoctestSkips:
    """
    If all examples in a doctest are skipped due to the SKIP option, then
    the tests should be SKIPPED rather than PASSED. (#957)

@ -493,3 +461,122 @@ class TestDocTestSkips:
        """)
        reprec = testdir.inline_run("--doctest-modules")
        reprec.assertoutcome(skipped=1)


class TestDoctestAutoUseFixtures:

    SCOPES = ['module', 'session', 'class', 'function']

    def test_doctest_module_session_fixture(self, testdir):
        """Test that session fixtures are initialized for doctest modules (#768)
        """
        # session fixture which changes some global data, which will
        # be accessed by doctests in a module
        testdir.makeconftest("""
            import pytest
            import sys

            @pytest.yield_fixture(autouse=True, scope='session')
            def myfixture():
                assert not hasattr(sys, 'pytest_session_data')
                sys.pytest_session_data = 1
                yield
                del sys.pytest_session_data
        """)
        testdir.makepyfile(foo="""
            import sys

            def foo():
                '''
                >>> assert sys.pytest_session_data == 1
                '''

            def bar():
                '''
                >>> assert sys.pytest_session_data == 1
                '''
        """)
        result = testdir.runpytest("--doctest-modules")
        result.stdout.fnmatch_lines('*2 passed*')

    @pytest.mark.parametrize('scope', SCOPES)
    @pytest.mark.parametrize('enable_doctest', [True, False])
    def test_fixture_scopes(self, testdir, scope, enable_doctest):
        """Test that auto-use fixtures work properly with doctest modules.
        See #1057 and #1100.
        """
        testdir.makeconftest('''
            import pytest

            @pytest.fixture(autouse=True, scope="{scope}")
            def auto(request):
                return 99
        '''.format(scope=scope))
        testdir.makepyfile(test_1='''
            def test_foo():
                """
                >>> getfixture('auto') + 1
                100
                """
            def test_bar():
                assert 1
        ''')
        params = ('--doctest-modules',) if enable_doctest else ()
        passes = 3 if enable_doctest else 2
        result = testdir.runpytest(*params)
        result.stdout.fnmatch_lines(['*=== %d passed in *' % passes])

    @pytest.mark.parametrize('scope', SCOPES)
    @pytest.mark.parametrize('autouse', [True, False])
    @pytest.mark.parametrize('use_fixture_in_doctest', [True, False])
    def test_fixture_module_doctest_scopes(self, testdir, scope, autouse,
                                           use_fixture_in_doctest):
        """Test that auto-use fixtures work properly with doctest files.
        See #1057 and #1100.
        """
        testdir.makeconftest('''
            import pytest

            @pytest.fixture(autouse={autouse}, scope="{scope}")
            def auto(request):
                return 99
        '''.format(scope=scope, autouse=autouse))
        if use_fixture_in_doctest:
            testdir.maketxtfile(test_doc="""
                >>> getfixture('auto')
                99
            """)
        else:
            testdir.maketxtfile(test_doc="""
                >>> 1 + 1
                2
            """)
        result = testdir.runpytest('--doctest-modules')
        assert 'FAILURES' not in str(result.stdout.str())
        result.stdout.fnmatch_lines(['*=== 1 passed in *'])

    @pytest.mark.parametrize('scope', SCOPES)
    def test_auto_use_request_attributes(self, testdir, scope):
        """Check that all attributes of a request in an autouse fixture
        behave as expected when requested for a doctest item.
        """
        testdir.makeconftest('''
            import pytest

            @pytest.fixture(autouse=True, scope="{scope}")
            def auto(request):
                if "{scope}" == 'module':
                    assert request.module is None
                if "{scope}" == 'class':
                    assert request.cls is None
                if "{scope}" == 'function':
                    assert request.function is None
                return 99
        '''.format(scope=scope))
        testdir.maketxtfile(test_doc="""
            >>> 1 + 1
            2
        """)
        result = testdir.runpytest('--doctest-modules')
        assert 'FAILURES' not in str(result.stdout.str())
        result.stdout.fnmatch_lines(['*=== 1 passed in *'])
@ -2,7 +2,9 @@
|
|||
|
||||
from xml.dom import minidom
|
||||
from _pytest.main import EXIT_NOTESTSCOLLECTED
|
||||
import py, sys, os
|
||||
import py
|
||||
import sys
|
||||
import os
|
||||
from _pytest.junitxml import LogXML
|
||||
import pytest
|
||||
|
||||
|
@ -11,16 +13,71 @@ def runandparse(testdir, *args):
|
|||
resultpath = testdir.tmpdir.join("junit.xml")
|
||||
result = testdir.runpytest("--junitxml=%s" % resultpath, *args)
|
||||
xmldoc = minidom.parse(str(resultpath))
|
||||
return result, xmldoc
|
||||
return result, DomNode(xmldoc)
|
||||
|
||||
|
||||
def assert_attr(node, **kwargs):
|
||||
__tracebackhide__ = True
|
||||
for name, expected in kwargs.items():
|
||||
def nodeval(node, name):
|
||||
anode = node.getAttributeNode(name)
|
||||
assert anode, "node %r has no attribute %r" %(node, name)
|
||||
val = anode.value
|
||||
if val != str(expected):
|
||||
py.test.fail("%r != %r" %(str(val), str(expected)))
|
||||
if anode is not None:
|
||||
return anode.value
|
||||
|
||||
expected = dict((name, str(value)) for name, value in kwargs.items())
|
||||
on_node = dict((name, nodeval(node, name)) for name in expected)
|
||||
assert on_node == expected
|
||||
|
||||
|
||||
class DomNode(object):
|
||||
def __init__(self, dom):
|
||||
self.__node = dom
|
||||
|
||||
def __repr__(self):
|
||||
return self.__node.toxml()
|
||||
|
||||
def find_first_by_tag(self, tag):
|
||||
return self.find_nth_by_tag(tag, 0)
|
||||
|
||||
def _by_tag(self, tag):
|
||||
return self.__node.getElementsByTagName(tag)
|
||||
|
||||
def find_nth_by_tag(self, tag, n):
|
||||
items = self._by_tag(tag)
|
||||
try:
|
||||
nth = items[n]
|
||||
except IndexError:
|
||||
pass
|
||||
else:
|
||||
return type(self)(nth)
|
||||
|
||||
def find_by_tag(self, tag):
|
||||
t = type(self)
|
||||
return [t(x) for x in self.__node.getElementsByTagName(tag)]
|
||||
|
||||
def __getitem__(self, key):
|
||||
node = self.__node.getAttributeNode(key)
|
||||
if node is not None:
|
||||
return node.value
|
||||
|
||||
def assert_attr(self, **kwargs):
|
||||
__tracebackhide__ = True
|
||||
return assert_attr(self.__node, **kwargs)
|
||||
|
||||
def toxml(self):
|
||||
return self.__node.toxml()
|
||||
|
||||
@property
|
||||
def text(self):
|
||||
return self.__node.childNodes[0].wholeText
|
||||
|
||||
@property
|
||||
def tag(self):
|
||||
return self.__node.tagName
|
||||
|
||||
@property
|
||||
def next_siebling(self):
|
||||
return type(self)(self.__node.nextSibling)
|
||||
|
||||
|
||||
class TestPython:
|
||||
def test_summing_simple(self, testdir):
|
||||
|
@@ -41,8 +98,8 @@ class TestPython:
        """)
        result, dom = runandparse(testdir)
        assert result.ret
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, name="pytest", errors=0, failures=1, skips=3, tests=2)
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(name="pytest", errors=0, failures=1, skips=3, tests=2)

    def test_timing_function(self, testdir):
        testdir.makepyfile("""
@@ -55,9 +112,9 @@ class TestPython:
            time.sleep(0.01)
        """)
        result, dom = runandparse(testdir)
        node = dom.getElementsByTagName("testsuite")[0]
        tnode = node.getElementsByTagName("testcase")[0]
        val = tnode.getAttributeNode("time").value
        node = dom.find_first_by_tag("testsuite")
        tnode = node.find_first_by_tag("testcase")
        val = tnode["time"]
        assert round(float(val), 2) >= 0.03

    def test_setup_error(self, testdir):
@@ -69,16 +126,16 @@ class TestPython:
        """)
        result, dom = runandparse(testdir)
        assert result.ret
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, errors=1, tests=0)
        tnode = node.getElementsByTagName("testcase")[0]
        assert_attr(tnode,
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(errors=1, tests=0)
        tnode = node.find_first_by_tag("testcase")
        tnode.assert_attr(
            file="test_setup_error.py",
            line="2",
            classname="test_setup_error",
            name="test_function")
        fnode = tnode.getElementsByTagName("error")[0]
        assert_attr(fnode, message="test setup failure")
        fnode = tnode.find_first_by_tag("error")
        fnode.assert_attr(message="test setup failure")
        assert "ValueError" in fnode.toxml()

    def test_skip_contains_name_reason(self, testdir):
@@ -89,19 +146,16 @@ class TestPython:
        """)
        result, dom = runandparse(testdir)
        assert result.ret == 0
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, skips=1)
        tnode = node.getElementsByTagName("testcase")[0]
        assert_attr(tnode,
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(skips=1)
        tnode = node.find_first_by_tag("testcase")
        tnode.assert_attr(
            file="test_skip_contains_name_reason.py",
            line="1",
            classname="test_skip_contains_name_reason",
            name="test_skip")
        snode = tnode.getElementsByTagName("skipped")[0]
        assert_attr(snode,
            type="pytest.skip",
            message="hello23",
            )
        snode = tnode.find_first_by_tag("skipped")
        snode.assert_attr(type="pytest.skip", message="hello23", )

    def test_classname_instance(self, testdir):
        testdir.makepyfile("""
@@ -111,10 +165,10 @@ class TestPython:
        """)
        result, dom = runandparse(testdir)
        assert result.ret
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, failures=1)
        tnode = node.getElementsByTagName("testcase")[0]
        assert_attr(tnode,
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(failures=1)
        tnode = node.find_first_by_tag("testcase")
        tnode.assert_attr(
            file="test_classname_instance.py",
            line="1",
            classname="test_classname_instance.TestClass",
@@ -125,10 +179,10 @@ class TestPython:
        p.write("def test_func(): 0/0")
        result, dom = runandparse(testdir)
        assert result.ret
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, failures=1)
        tnode = node.getElementsByTagName("testcase")[0]
        assert_attr(tnode,
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(failures=1)
        tnode = node.find_first_by_tag("testcase")
        tnode.assert_attr(
            file=os.path.join("sub", "test_hello.py"),
            line="0",
            classname="sub.test_hello",
@@ -139,12 +193,12 @@ class TestPython:
        testdir.makepyfile("def test_function(): pass")
        result, dom = runandparse(testdir)
        assert result.ret
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, errors=1, tests=0)
        tnode = node.getElementsByTagName("testcase")[0]
        assert_attr(tnode, classname="pytest", name="internal")
        fnode = tnode.getElementsByTagName("error")[0]
        assert_attr(fnode, message="internal error")
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(errors=1, tests=0)
        tnode = node.find_first_by_tag("testcase")
        tnode.assert_attr(classname="pytest", name="internal")
        fnode = tnode.find_first_by_tag("error")
        fnode.assert_attr(message="internal error")
        assert "Division" in fnode.toxml()

    def test_failure_function(self, testdir):
@@ -158,22 +212,22 @@ class TestPython:

        result, dom = runandparse(testdir)
        assert result.ret
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, failures=1, tests=1)
        tnode = node.getElementsByTagName("testcase")[0]
        assert_attr(tnode,
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(failures=1, tests=1)
        tnode = node.find_first_by_tag("testcase")
        tnode.assert_attr(
            file="test_failure_function.py",
            line="1",
            classname="test_failure_function",
            name="test_fail")
        fnode = tnode.getElementsByTagName("failure")[0]
        assert_attr(fnode, message="ValueError: 42")
        fnode = tnode.find_first_by_tag("failure")
        fnode.assert_attr(message="ValueError: 42")
        assert "ValueError" in fnode.toxml()
        systemout = fnode.nextSibling
        assert systemout.tagName == "system-out"
        systemout = fnode.next_siebling
        assert systemout.tag == "system-out"
        assert "hello-stdout" in systemout.toxml()
        systemerr = systemout.nextSibling
        assert systemerr.tagName == "system-err"
        systemerr = systemout.next_siebling
        assert systemerr.tag == "system-err"
        assert "hello-stderr" in systemerr.toxml()

    def test_failure_verbose_message(self, testdir):
@@ -184,10 +238,10 @@ class TestPython:
        """)

        result, dom = runandparse(testdir)
        node = dom.getElementsByTagName("testsuite")[0]
        tnode = node.getElementsByTagName("testcase")[0]
        fnode = tnode.getElementsByTagName("failure")[0]
        assert_attr(fnode, message="AssertionError: An error assert 0")
        node = dom.find_first_by_tag("testsuite")
        tnode = node.find_first_by_tag("testcase")
        fnode = tnode.find_first_by_tag("failure")
        fnode.assert_attr(message="AssertionError: An error assert 0")

    def test_failure_escape(self, testdir):
        testdir.makepyfile("""
@@ -199,22 +253,21 @@ class TestPython:
        """)
        result, dom = runandparse(testdir)
        assert result.ret
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, failures=3, tests=3)
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(failures=3, tests=3)

        for index, char in enumerate("<&'"):

            tnode = node.getElementsByTagName("testcase")[index]
            assert_attr(tnode,
            tnode = node.find_nth_by_tag("testcase", index)
            tnode.assert_attr(
                file="test_failure_escape.py",
                line="1",
                classname="test_failure_escape",
                name="test_func[%s]" % char)
            sysout = tnode.getElementsByTagName('system-out')[0]
            text = sysout.childNodes[0].wholeText
            sysout = tnode.find_first_by_tag('system-out')
            text = sysout.text
            assert text == '%s\n' % char


    def test_junit_prefixing(self, testdir):
        testdir.makepyfile("""
            def test_func():
@@ -225,20 +278,20 @@ class TestPython:
        """)
        result, dom = runandparse(testdir, "--junitprefix=xyz")
        assert result.ret
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, failures=1, tests=2)
        tnode = node.getElementsByTagName("testcase")[0]
        assert_attr(tnode,
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(failures=1, tests=2)
        tnode = node.find_first_by_tag("testcase")
        tnode.assert_attr(
            file="test_junit_prefixing.py",
            line="0",
            classname="xyz.test_junit_prefixing",
            name="test_func")
        tnode = node.getElementsByTagName("testcase")[1]
        assert_attr(tnode,
        tnode = node.find_nth_by_tag("testcase", 1)
        tnode.assert_attr(
            file="test_junit_prefixing.py",
            line="3",
            classname="xyz.test_junit_prefixing."
                      "TestHello",
            "TestHello",
            name="test_hello")

    def test_xfailure_function(self, testdir):
@@ -249,17 +302,17 @@ class TestPython:
        """)
        result, dom = runandparse(testdir)
        assert not result.ret
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, skips=1, tests=0)
        tnode = node.getElementsByTagName("testcase")[0]
        assert_attr(tnode,
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(skips=1, tests=0)
        tnode = node.find_first_by_tag("testcase")
        tnode.assert_attr(
            file="test_xfailure_function.py",
            line="1",
            classname="test_xfailure_function",
            name="test_xfail")
        fnode = tnode.getElementsByTagName("skipped")[0]
        assert_attr(fnode, message="expected test failure")
        #assert "ValueError" in fnode.toxml()
        fnode = tnode.find_first_by_tag("skipped")
        fnode.assert_attr(message="expected test failure")
        # assert "ValueError" in fnode.toxml()

    def test_xfailure_xpass(self, testdir):
        testdir.makepyfile("""
@@ -269,49 +322,50 @@ class TestPython:
                pass
        """)
        result, dom = runandparse(testdir)
        #assert result.ret
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, skips=1, tests=0)
        tnode = node.getElementsByTagName("testcase")[0]
        assert_attr(tnode,
        # assert result.ret
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(skips=1, tests=0)
        tnode = node.find_first_by_tag("testcase")
        tnode.assert_attr(
            file="test_xfailure_xpass.py",
            line="1",
            classname="test_xfailure_xpass",
            name="test_xpass")
        fnode = tnode.getElementsByTagName("skipped")[0]
        assert_attr(fnode, message="xfail-marked test passes unexpectedly")
        #assert "ValueError" in fnode.toxml()
        fnode = tnode.find_first_by_tag("skipped")
        fnode.assert_attr(message="xfail-marked test passes unexpectedly")
        # assert "ValueError" in fnode.toxml()

    def test_collect_error(self, testdir):
        testdir.makepyfile("syntax error")
        result, dom = runandparse(testdir)
        assert result.ret
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, errors=1, tests=0)
        tnode = node.getElementsByTagName("testcase")[0]
        assert_attr(tnode,
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(errors=1, tests=0)
        tnode = node.find_first_by_tag("testcase")
        tnode.assert_attr(
            file="test_collect_error.py",
            #classname="test_collect_error",
            name="test_collect_error")
        assert tnode.getAttributeNode("line") is None
        fnode = tnode.getElementsByTagName("error")[0]
        assert_attr(fnode, message="collection failure")
        assert tnode["line"] is None
        fnode = tnode.find_first_by_tag("error")
        fnode.assert_attr(message="collection failure")
        assert "SyntaxError" in fnode.toxml()

    def test_collect_skipped(self, testdir):
        testdir.makepyfile("import pytest; pytest.skip('xyz')")
        result, dom = runandparse(testdir)
        assert result.ret == EXIT_NOTESTSCOLLECTED
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, skips=1, tests=0)
        tnode = node.getElementsByTagName("testcase")[0]
        assert_attr(tnode,
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(skips=1, tests=0)
        tnode = node.find_first_by_tag("testcase")
        tnode.assert_attr(
            file="test_collect_skipped.py",
            #classname="test_collect_error",
            name="test_collect_skipped")
        assert tnode.getAttributeNode("line") is None # py.test doesn't give us a line here.
        fnode = tnode.getElementsByTagName("skipped")[0]
        assert_attr(fnode, message="collection skipped")

        # py.test doesn't give us a line here.
        assert tnode["line"] is None

        fnode = tnode.find_first_by_tag("skipped")
        fnode.assert_attr(message="collection skipped")

    def test_unicode(self, testdir):
        value = 'hx\xc4\x85\xc4\x87\n'
@@ -323,8 +377,8 @@ class TestPython:
        """ % value)
        result, dom = runandparse(testdir)
        assert result.ret == 1
        tnode = dom.getElementsByTagName("testcase")[0]
        fnode = tnode.getElementsByTagName("failure")[0]
        tnode = dom.find_first_by_tag("testcase")
        fnode = tnode.find_first_by_tag("failure")
        if not sys.platform.startswith("java"):
            assert "hx" in fnode.toxml()
@@ -347,9 +401,9 @@ class TestPython:
            print('hello-stdout')
        """)
        result, dom = runandparse(testdir)
        node = dom.getElementsByTagName("testsuite")[0]
        pnode = node.getElementsByTagName("testcase")[0]
        systemout = pnode.getElementsByTagName("system-out")[0]
        node = dom.find_first_by_tag("testsuite")
        pnode = node.find_first_by_tag("testcase")
        systemout = pnode.find_first_by_tag("system-out")
        assert "hello-stdout" in systemout.toxml()

    def test_pass_captures_stderr(self, testdir):
@@ -359,27 +413,32 @@ class TestPython:
            sys.stderr.write('hello-stderr')
        """)
        result, dom = runandparse(testdir)
        node = dom.getElementsByTagName("testsuite")[0]
        pnode = node.getElementsByTagName("testcase")[0]
        systemout = pnode.getElementsByTagName("system-err")[0]
        node = dom.find_first_by_tag("testsuite")
        pnode = node.find_first_by_tag("testcase")
        systemout = pnode.find_first_by_tag("system-err")
        assert "hello-stderr" in systemout.toxml()


def test_mangle_testnames():
    from _pytest.junitxml import mangle_testnames
    names = ["a/pything.py", "Class", "()", "method"]
    newnames = mangle_testnames(names)
    assert newnames == ["a.pything", "Class", "method"]


def test_dont_configure_on_slaves(tmpdir):
    gotten = []

    class FakeConfig:
        def __init__(self):
            self.pluginmanager = self
            self.option = self

        junitprefix = None
        #XXX: shouldnt need tmpdir ?
        # XXX: shouldnt need tmpdir ?
        xmlpath = str(tmpdir.join('junix.xml'))
        register = gotten.append

    fake_config = FakeConfig()
    from _pytest import junitxml
    junitxml.pytest_configure(fake_config)
@@ -408,14 +467,12 @@ class TestNonPython:
        testdir.tmpdir.join("myfile.xyz").write("hello")
        result, dom = runandparse(testdir)
        assert result.ret
        node = dom.getElementsByTagName("testsuite")[0]
        assert_attr(node, errors=0, failures=1, skips=0, tests=1)
        tnode = node.getElementsByTagName("testcase")[0]
        assert_attr(tnode,
            #classname="test_collect_error",
            name="myfile.xyz")
        fnode = tnode.getElementsByTagName("failure")[0]
        assert_attr(fnode, message="custom item runtest failed")
        node = dom.find_first_by_tag("testsuite")
        node.assert_attr(errors=0, failures=1, skips=0, tests=1)
        tnode = node.find_first_by_tag("testcase")
        tnode.assert_attr(name="myfile.xyz")
        fnode = tnode.find_first_by_tag("failure")
        fnode.assert_attr(message="custom item runtest failed")
        assert "custom item runtest failed" in fnode.toxml()
@@ -449,6 +506,7 @@ def test_nullbyte_replace(testdir):
    text = xmlf.read()
    assert '#x0' in text


def test_invalid_xml_escape():
    # Test some more invalid xml chars, the full range should be
    # tested really but let's just thest the edges of the ranges
@@ -463,14 +521,13 @@ def test_invalid_xml_escape():
        unichr(65)
    except NameError:
        unichr = chr
    invalid = (0x00, 0x1, 0xB, 0xC, 0xE, 0x19,
               27, # issue #126
               0xD800, 0xDFFF, 0xFFFE, 0x0FFFF) #, 0x110000)
    valid = (0x9, 0xA, 0x20,) # 0xD, 0xD7FF, 0xE000, 0xFFFD, 0x10000, 0x10FFFF)
    invalid = (0x00, 0x1, 0xB, 0xC, 0xE, 0x19, 27, # issue #126
               0xD800, 0xDFFF, 0xFFFE, 0x0FFFF) # , 0x110000)
    valid = (0x9, 0xA, 0x20, )
    # 0xD, 0xD7FF, 0xE000, 0xFFFD, 0x10000, 0x10FFFF)

    from _pytest.junitxml import bin_xml_escape

    for i in invalid:
        got = bin_xml_escape(unichr(i)).uniobj
        if i <= 0xFF:
@@ -481,6 +538,7 @@ def test_invalid_xml_escape():
    for i in valid:
        assert chr(i) == bin_xml_escape(unichr(i)).uniobj


def test_logxml_path_expansion(tmpdir, monkeypatch):
    home_tilde = py.path.local(os.path.expanduser('~')).join('test.xml')
@@ -494,6 +552,7 @@ def test_logxml_path_expansion(tmpdir, monkeypatch):
    xml_var = LogXML('$HOME%stest.xml' % tmpdir.sep, None)
    assert xml_var.logfile == home_var


def test_logxml_changingdir(testdir):
    testdir.makepyfile("""
        def test_func():
@@ -505,6 +564,7 @@ def test_logxml_changingdir(testdir):
    assert result.ret == 0
    assert testdir.tmpdir.join("a/x.xml").check()


def test_logxml_makedir(testdir):
    """--junitxml should automatically create directories for the xml file"""
    testdir.makepyfile("""
@@ -515,6 +575,7 @@ def test_logxml_makedir(testdir):
    assert result.ret == 0
    assert testdir.tmpdir.join("path/to/results.xml").check()


def test_escaped_parametrized_names_xml(testdir):
    testdir.makepyfile("""
        import pytest
@@ -524,49 +585,57 @@ def test_escaped_parametrized_names_xml(testdir):
    """)
    result, dom = runandparse(testdir)
    assert result.ret == 0
    node = dom.getElementsByTagName("testcase")[0]
    assert_attr(node,
        name="test_func[#x00]")
    node = dom.find_first_by_tag("testcase")
    node.assert_attr(name="test_func[#x00]")


def test_unicode_issue368(testdir):
    path = testdir.tmpdir.join("test.xml")
    log = LogXML(str(path), None)
    ustr = py.builtin._totext("ВНИ!", "utf-8")
    from _pytest.runner import BaseReport

    class Report(BaseReport):
        longrepr = ustr
        sections = []
        nodeid = "something"
        location = 'tests/filename.py', 42, 'TestClass.method'
    report = Report()

    test_report = Report()

    # hopefully this is not too brittle ...
    log.pytest_sessionstart()
    log._opentestcase(report)
    log.append_failure(report)
    log.append_collect_error(report)
    log.append_collect_skipped(report)
    log.append_error(report)
    report.longrepr = "filename", 1, ustr
    log.append_skipped(report)
    report.longrepr = "filename", 1, "Skipped: 卡嘣嘣"
    log.append_skipped(report)
    report.wasxfail = ustr
    log.append_skipped(report)
    node_reporter = log._opentestcase(test_report)
    node_reporter.append_failure(test_report)
    node_reporter.append_collect_error(test_report)
    node_reporter.append_collect_skipped(test_report)
    node_reporter.append_error(test_report)
    test_report.longrepr = "filename", 1, ustr
    node_reporter.append_skipped(test_report)
    test_report.longrepr = "filename", 1, "Skipped: 卡嘣嘣"
    node_reporter.append_skipped(test_report)
    test_report.wasxfail = ustr
    node_reporter.append_skipped(test_report)
    log.pytest_sessionfinish()


def test_record_property(testdir):
    testdir.makepyfile("""
        def test_record(record_xml_property):
        import pytest

        @pytest.fixture
        def other(record_xml_property):
            record_xml_property("bar", 1)
        def test_record(record_xml_property, other):
            record_xml_property("foo", "<1");
    """)
    result, dom = runandparse(testdir, '-rw')
    node = dom.getElementsByTagName("testsuite")[0]
    tnode = node.getElementsByTagName("testcase")[0]
    psnode = tnode.getElementsByTagName('properties')[0]
    pnode = psnode.getElementsByTagName('property')[0]
    assert_attr(pnode, name="foo", value="<1")
    node = dom.find_first_by_tag("testsuite")
    tnode = node.find_first_by_tag("testcase")
    psnode = tnode.find_first_by_tag('properties')
    pnodes = psnode.find_by_tag('property')
    pnodes[0].assert_attr(name="bar", value="1")
    pnodes[1].assert_attr(name="foo", value="<1")
    result.stdout.fnmatch_lines('*C3*test_record_property.py*experimental*')
@@ -583,10 +652,33 @@ def test_random_report_log_xdist(testdir):
            assert i != 22
    """)
    _, dom = runandparse(testdir, '-n2')
    suite_node = dom.getElementsByTagName("testsuite")[0]
    suite_node = dom.find_first_by_tag("testsuite")
    failed = []
    for case_node in suite_node.getElementsByTagName("testcase"):
        if case_node.getElementsByTagName('failure'):
            failed.append(case_node.getAttributeNode('name').value)
    for case_node in suite_node.find_by_tag("testcase"):
        if case_node.find_first_by_tag('failure'):
            failed.append(case_node['name'])

    assert failed == ['test_x[22]']


def test_runs_twice(testdir):
    f = testdir.makepyfile('''
        def test_pass():
            pass
    ''')

    result = testdir.runpytest(f, f, '--junitxml', testdir.tmpdir.join("test.xml"))
    assert 'INTERNALERROR' not in str(result.stdout)


def test_runs_twice_xdist(testdir):
    pytest.importorskip('xdist')
    f = testdir.makepyfile('''
        def test_pass():
            pass
    ''')

    result = testdir.runpytest(f,
        '--dist', 'each', '--tx', '2*popen',
        '--junitxml', testdir.tmpdir.join("test.xml"))
    assert 'INTERNALERROR' not in str(result.stdout)
@@ -1,7 +1,8 @@
# encoding: utf-8
import sys
import pytest

class TestPasting:
class TestPasteCapture:

    @pytest.fixture
    def pastebinlist(self, monkeypatch, request):
@@ -27,6 +28,7 @@ class TestPasting:
        assert reprec.countoutcomes() == [1,1,1]

    def test_all(self, testdir, pastebinlist):
        from _pytest.pytester import LineMatcher
        testpath = testdir.makepyfile("""
            import pytest
            def test_pass():
@@ -39,9 +41,34 @@ class TestPasting:
        reprec = testdir.inline_run(testpath, "--pastebin=all", '-v')
        assert reprec.countoutcomes() == [1,1,1]
        assert len(pastebinlist) == 1
        s = pastebinlist[0]
        for x in 'test_fail test_skip test_pass'.split():
            assert x in s
        contents = pastebinlist[0].decode('utf-8')
        matcher = LineMatcher(contents.splitlines())
        matcher.fnmatch_lines([
            '*test_pass PASSED*',
            '*test_fail FAILED*',
            '*test_skip SKIPPED*',
            '*== 1 failed, 1 passed, 1 skipped in *'
        ])

    def test_non_ascii_paste_text(self, testdir):
        """Make sure that text which contains non-ascii characters is pasted
        correctly. See #1219.
        """
        testdir.makepyfile(test_unicode="""
            # encoding: utf-8
            def test():
                assert '☺' == 1
        """)
        result = testdir.runpytest('--pastebin=all')
        if sys.version_info[0] == 3:
            expected_msg = "*assert '☺' == 1*"
        else:
            expected_msg = "*assert '\\xe2\\x98\\xba' == 1*"
        result.stdout.fnmatch_lines([
            expected_msg,
            "*== 1 failed in *",
            '*Sending information to Paste Service*',
        ])


class TestPaste:
@@ -62,7 +89,7 @@ class TestPaste:
        class DummyFile:
            def read(self):
                # part of html of a normal response
                return 'View <a href="/raw/3c0c6750bd">raw</a>.'
                return b'View <a href="/raw/3c0c6750bd">raw</a>.'
        return DummyFile()

    if sys.version_info < (3, 0):
@@ -74,14 +101,15 @@ class TestPaste:
        return calls

    def test_create_new_paste(self, pastebin, mocked_urlopen):
        result = pastebin.create_new_paste('full-paste-contents')
        result = pastebin.create_new_paste(b'full-paste-contents')
        assert result == 'https://bpaste.net/show/3c0c6750bd'
        assert len(mocked_urlopen) == 1
        url, data = mocked_urlopen[0]
        assert type(data) is bytes
        lexer = 'python3' if sys.version_info[0] == 3 else 'python'
        assert url == 'https://bpaste.net'
        assert 'lexer=%s' % lexer in data
        assert 'code=full-paste-contents' in data
        assert 'expiry=1week' in data
        assert 'lexer=%s' % lexer in data.decode()
        assert 'code=full-paste-contents' in data.decode()
        assert 'expiry=1week' in data.decode()
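A brief aside on the bytes handling above: on Python 3 the urlopen() POST body has to be bytes, which is why the test now hands bytes to create_new_paste and decodes the captured data before the substring checks. A minimal, hypothetical sketch of that pattern using only standard-library names (the parameter values are illustrative, not taken from the plugin):

    from urllib.parse import urlencode

    params = {'code': 'full-paste-contents', 'lexer': 'python3', 'expiry': '1week'}
    data = urlencode(params).encode('ascii')            # bytes, matching `assert type(data) is bytes`
    assert 'code=full-paste-contents' in data.decode()  # substring checks need str, hence .decode()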
@@ -63,33 +63,31 @@ class TestWarningsRecorderChecker(object):
        with rec:
            pass # can't enter twice

#
# ============ test pytest.deprecated_call() ==============
#

def dep(i):
    if i == 0:
        py.std.warnings.warn("is deprecated", DeprecationWarning)
    return 42

reg = {}
def dep_explicit(i):
    if i == 0:
        py.std.warnings.warn_explicit("dep_explicit", category=DeprecationWarning,
                                      filename="hello", lineno=3)

class TestDeprecatedCall(object):
    """test pytest.deprecated_call()"""

    def dep(self, i, j=None):
        if i == 0:
            py.std.warnings.warn("is deprecated", DeprecationWarning,
                                 stacklevel=1)
        return 42

    def dep_explicit(self, i):
        if i == 0:
            py.std.warnings.warn_explicit("dep_explicit", category=DeprecationWarning,
                                          filename="hello", lineno=3)

    def test_deprecated_call_raises(self):
        excinfo = pytest.raises(AssertionError,
                                "pytest.deprecated_call(dep, 3)")
        with pytest.raises(AssertionError) as excinfo:
            pytest.deprecated_call(self.dep, 3, 5)
        assert str(excinfo).find("did not produce") != -1

    def test_deprecated_call(self):
        pytest.deprecated_call(dep, 0)
        pytest.deprecated_call(self.dep, 0, 5)

    def test_deprecated_call_ret(self):
        ret = pytest.deprecated_call(dep, 0)
        ret = pytest.deprecated_call(self.dep, 0)
        assert ret == 42

    def test_deprecated_call_preserves(self):
@@ -105,35 +103,58 @@ class TestDeprecatedCall(object):
        assert warn_explicit is py.std.warnings.warn_explicit

    def test_deprecated_explicit_call_raises(self):
        pytest.raises(AssertionError,
                      "pytest.deprecated_call(dep_explicit, 3)")
        with pytest.raises(AssertionError):
            pytest.deprecated_call(self.dep_explicit, 3)

    def test_deprecated_explicit_call(self):
        pytest.deprecated_call(dep_explicit, 0)
        pytest.deprecated_call(dep_explicit, 0)
        pytest.deprecated_call(self.dep_explicit, 0)
        pytest.deprecated_call(self.dep_explicit, 0)

    def test_deprecated_call_as_context_manager_no_warning(self):
        with pytest.raises(pytest.fail.Exception) as ex:
            with pytest.deprecated_call():
                dep(1)
                self.dep(1)
        assert str(ex.value) == "DID NOT WARN"

    def test_deprecated_call_as_context_manager(self):
        with pytest.deprecated_call():
            dep(0)
            self.dep(0)

    def test_deprecated_call_pending(self):
        f = lambda: py.std.warnings.warn(PendingDeprecationWarning("hi"))
        def f():
            py.std.warnings.warn(PendingDeprecationWarning("hi"))
        pytest.deprecated_call(f)

    def test_deprecated_call_specificity(self):
        other_warnings = [Warning, UserWarning, SyntaxWarning, RuntimeWarning,
                          FutureWarning, ImportWarning, UnicodeWarning]
        for warning in other_warnings:
            f = lambda: py.std.warnings.warn(warning("hi"))
            def f():
                py.std.warnings.warn(warning("hi"))
            with pytest.raises(AssertionError):
                pytest.deprecated_call(f)

    def test_deprecated_function_already_called(self, testdir):
        """deprecated_call should be able to catch a call to a deprecated
        function even if that function has already been called in the same
        module. See #1190.
        """
        testdir.makepyfile("""
            import warnings
            import pytest

            def deprecated_function():
                warnings.warn("deprecated", DeprecationWarning)

            def test_one():
                deprecated_function()

            def test_two():
                pytest.deprecated_call(deprecated_function)
        """)
        result = testdir.runpytest()
        result.stdout.fnmatch_lines('*=== 2 passed in *===')
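For reference, pytest.deprecated_call() supports both forms exercised in the tests above: calling it with a function (in which case it returns that function's return value) and using it as a context manager. A minimal sketch, independent of the test class, using only the public API shown in this diff:

    import warnings
    import pytest

    def deprecated_function():
        warnings.warn("is deprecated", DeprecationWarning)
        return 42

    assert pytest.deprecated_call(deprecated_function) == 42  # call form: returns the value
    with pytest.deprecated_call():                            # context-manager form
        deprecated_function()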

class TestWarns(object):
    def test_strings(self):
@@ -806,10 +806,10 @@ def test_terminal_summary(testdir):
    ("green", "1 passed, 1 xpassed", {"xpassed": (1,), "passed": (1,)}),

    # Likewise if no tests were found at all
    ("yellow", "", {}),
    ("yellow", "no tests ran", {}),

    # Test the empty-key special case
    ("yellow", "", {"": (1,)}),
    ("yellow", "no tests ran", {"": (1,)}),
    ("green", "1 passed", {"": (1,), "passed": (1,)}),
@@ -136,15 +136,20 @@ def test_tmpdir_fallback_tox_env(testdir, monkeypatch):
    reprec.assertoutcome(passed=1)


@pytest.fixture
def break_getuser(monkeypatch):
    monkeypatch.setattr('os.getuid', lambda: -1)
    # taken from python 2.7/3.4
    for envvar in ('LOGNAME', 'USER', 'LNAME', 'USERNAME'):
        monkeypatch.delenv(envvar, raising=False)


@pytest.mark.usefixtures("break_getuser")
@pytest.mark.skipif(sys.platform.startswith('win'), reason='no os.getuid on windows')
def test_tmpdir_fallback_uid_not_found(testdir, monkeypatch):
def test_tmpdir_fallback_uid_not_found(testdir):
    """Test that tmpdir works even if the current process's user id does not
    correspond to a valid user.
    """
    import os
    monkeypatch.setattr(os, 'getuid', lambda: -1)
    monkeypatch.delenv('USER', raising=False)
    monkeypatch.delenv('USERNAME', raising=False)

    testdir.makepyfile("""
        import pytest
@@ -155,17 +160,13 @@ def test_tmpdir_fallback_uid_not_found(testdir, monkeypatch):
    reprec.assertoutcome(passed=1)


@pytest.mark.usefixtures("break_getuser")
@pytest.mark.skipif(sys.platform.startswith('win'), reason='no os.getuid on windows')
def test_get_user_uid_not_found(monkeypatch):
def test_get_user_uid_not_found():
    """Test that get_user() function works even if the current process's
    user id does not correspond to a valid user (e.g. running pytest in a
    Docker container with 'docker run -u'.
    """
    import os
    monkeypatch.setattr(os, 'getuid', lambda: -1)
    monkeypatch.delenv('USER', raising=False)
    monkeypatch.delenv('USERNAME', raising=False)

    from _pytest.tmpdir import get_user
    assert get_user() is None
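The break_getuser fixture above centralizes the user-id monkeypatching that both tests previously did inline, which is why their monkeypatch parameters disappear; because the tests no longer accept the fixture as an argument, they opt in through the usefixtures marker instead. A minimal sketch of that opt-in pattern, reusing only names that appear in this diff:

    @pytest.mark.usefixtures("break_getuser")          # fixture runs without being a test argument
    @pytest.mark.skipif(sys.platform.startswith('win'), reason='no os.getuid on windows')
    def test_get_user_uid_not_found():
        from _pytest.tmpdir import get_user
        assert get_user() is None                      # getuid/env vars already broken by the fixture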
@@ -718,3 +718,19 @@ def test_unittest_raise_skip_issue748(testdir):
        *SKIP*[1]*test_foo.py*skipping due to reasons*
        *1 skipped*
    """)

@pytest.mark.skipif("sys.version_info < (2,7)")
def test_unittest_skip_issue1169(testdir):
    testdir.makepyfile(test_foo="""
        import unittest

        class MyTestCase(unittest.TestCase):
            @unittest.skip("skipping due to reasons")
            def test_skip(self):
                self.fail()
    """)
    result = testdir.runpytest("-v", '-rs')
    result.stdout.fnmatch_lines("""
        *SKIP*[1]*skipping due to reasons*
        *1 skipped*
    """)