Merge remote-tracking branch 'upstream/master' into merge-master-into-features
commit 9646a1cd7a

@@ -3,7 +3,7 @@ Thanks for submitting a PR, your contribution is really appreciated!
Here's a quick checklist that should be present in PRs (you can delete this text from the final description, this is
just a guideline):

- [ ] Create a new changelog file in the `changelog` folder, with a name like `<ISSUE NUMBER>.<TYPE>.rst`. See [changelog/README.rst](/changelog/README.rst) for details.
- [ ] Create a new changelog file in the `changelog` folder, with a name like `<ISSUE NUMBER>.<TYPE>.rst`. See [changelog/README.rst](https://github.com/pytest-dev/pytest/blob/master/changelog/README.rst) for details.
- [ ] Target the `master` branch for bug fixes, documentation updates and trivial changes.
- [ ] Target the `features` branch for new features and removals/deprecations.
- [ ] Include documentation when adding new features.

.travis.yml (20 changed lines)

@@ -12,19 +12,15 @@ install:
  - pip install --upgrade --pre tox
env:
  matrix:
    # note: please use "tox --listenvs" to populate the build matrix below
    # please remove the linting env in all cases
    - TOXENV=py27-pexpect
    - TOXENV=py27-xdist
    - TOXENV=py27-trial
    - TOXENV=py27-numpy
    - TOXENV=py27-pluggymaster PYTEST_NO_COVERAGE=1
    - TOXENV=py36-pexpect
    - TOXENV=py36-xdist
    - TOXENV=py36-trial
    - TOXENV=py36-numpy
    - TOXENV=py36-pluggymaster PYTEST_NO_COVERAGE=1
    # Specialized factors for py27.
    - TOXENV=py27-pexpect,py27-trial,py27-numpy
    - TOXENV=py27-nobyte
    - TOXENV=py27-xdist
    - TOXENV=py27-pluggymaster PYTEST_NO_COVERAGE=1
    # Specialized factors for py36.
    - TOXENV=py36-pexpect,py36-trial,py36-numpy
    - TOXENV=py36-xdist
    - TOXENV=py36-pluggymaster PYTEST_NO_COVERAGE=1

jobs:
  include:

AUTHORS (1 changed line)

@@ -211,6 +211,7 @@ Thomas Hisch
Tim Strazny
Tom Dalton
Tom Viner
Tomer Keren
Trevor Bekolay
Tyler Goodlet
Tzu-ping Chung

@@ -280,6 +280,47 @@ Here is a simple overview, with pytest-specific bits:
    base: features # if it's a feature


Writing Tests
----------------------------

Writing tests for plugins or for pytest itself is often done using the `testdir fixture <https://docs.pytest.org/en/latest/reference.html#testdir>`_, as a "black-box" test.

For example, to ensure a simple test passes you can write:

.. code-block:: python

    def test_true_assertion(testdir):
        testdir.makepyfile(
            """
            def test_foo():
                assert True
            """
        )
        result = testdir.runpytest()
        result.assert_outcomes(failed=0, passed=1)


Alternatively, it is possible to make checks based on the actual output of the terminal using
*glob-like* expressions:

.. code-block:: python

    def test_false_assertion(testdir):
        testdir.makepyfile(
            """
            def test_foo():
                assert False
            """
        )
        result = testdir.runpytest()
        result.stdout.fnmatch_lines(["*assert False*", "*1 failed*"])

When choosing a file where to write a new test, take a look at the existing files and see if there's
one file which looks like a good fit. For example, a regression test about a bug in the ``--lf`` option
should go into ``test_cacheprovider.py``, given that this option is implemented in ``cacheprovider.py``.
If in doubt, go ahead and open a PR with your best guess and we can discuss this over the code.
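
As a further illustration of the ``testdir`` fixture used above (an editorial sketch, not part of this commit), the same kind of check can also run pytest in a separate process via ``runpytest_subprocess`` when full isolation from the outer test run is wanted:

.. code-block:: python

    def test_passing_in_subprocess(testdir):
        testdir.makepyfile(
            """
            def test_foo():
                assert True
            """
        )
        # runs pytest in a fresh subprocess instead of in-process
        result = testdir.runpytest_subprocess()
        result.assert_outcomes(passed=1)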


Joining the Development Team
----------------------------

appveyor.yml (26 changed lines)

@@ -1,27 +1,29 @@
environment:
  matrix:
    - TOXENV: "linting,docs,doctesting"
      PYTEST_NO_COVERAGE: "1"
    - TOXENV: "py27"
    - TOXENV: "py34"
    - TOXENV: "py35"
    - TOXENV: "py36"
    - TOXENV: "py37"
      PYTEST_NO_COVERAGE: "1"
    - TOXENV: "linting,docs,doctesting"
    - TOXENV: "py36"
    - TOXENV: "py35"
    - TOXENV: "py34"
    - TOXENV: "pypy"
      PYTEST_NO_COVERAGE: "1"
    - TOXENV: "py27-xdist"
    - TOXENV: "py27-trial"
    - TOXENV: "py27-numpy"
    # Specialized factors for py27.
    - TOXENV: "py27-trial,py27-numpy,py27-nobyte"
    - TOXENV: "py27-pluggymaster"
      PYTEST_NO_COVERAGE: "1"
    - TOXENV: "py36-xdist"
    - TOXENV: "py36-trial"
    - TOXENV: "py36-numpy"
    - TOXENV: "py27-xdist"
    # Specialized factors for py36.
    - TOXENV: "py36-trial,py36-numpy"
    - TOXENV: "py36-pluggymaster"
      PYTEST_NO_COVERAGE: "1"
    - TOXENV: "py27-nobyte"
    - TOXENV: "py36-freeze"
      PYTEST_NO_COVERAGE: "1"
    - TOXENV: "py36-xdist"

matrix:
  fast_finish: true

install:
  - echo Installed Pythons

@@ -0,0 +1 @@
Exclude 0.00 second entries from ``--durations`` output unless ``-vv`` is passed on the command-line.

@@ -0,0 +1 @@
Fix duplicate printing of internal errors when using ``--pdb``.

@@ -0,0 +1 @@
Add a ``testdir`` testing example to the CONTRIBUTING.rst guide.

@@ -0,0 +1 @@
Display the filename when encountering ``SyntaxWarning``.

@@ -304,7 +304,7 @@ This form of test function doesn't support fixtures properly, and users should s
.. code-block:: python

    @pytest.mark.parametrize("x, y", [(2, 4), (3, 9)])
    def test_squared():
    def test_squared(x, y):
        assert x ** 2 == y

@@ -269,6 +269,7 @@ To get a list of the slowest 10 test durations::

    pytest --durations=10

By default, pytest will not show test durations that are too small (<0.01s) unless ``-vv`` is passed on the command-line.
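
For instance (an illustrative sketch, not taken from this change), only a test that actually takes measurable time shows up in the durations list at default verbosity:

.. code-block:: python

    import time


    def test_slow():
        # well above the 0.01s reporting threshold, so it is listed by
        # ``pytest --durations=10`` even without ``-vv``
        time.sleep(0.1)

Near-instant tests are instead summarised with a note that their 0.00s durations are hidden unless ``-vv`` is used.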

Creating JUnitXML format files
----------------------------------------------------

@@ -399,7 +399,7 @@ def _rewrite_test(config, fn):
    finally:
        del state._indecode
    try:
        tree = ast.parse(source)
        tree = ast.parse(source, filename=fn.strpath)
    except SyntaxError:
        # Let this pop up again in the real import.
        state.trace("failed to parse: %r" % (fn,))
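
The practical effect of passing filename= here (an editorial sketch, not part of the diff) is that syntax problems, including the SyntaxWarning case mentioned in the changelog, get attributed to the real test file rather than the parser's default placeholder:

    # Illustrative only: a SyntaxError raised by ast.parse() carries the
    # filename it was given, which is what ends up in warnings/tracebacks.
    import ast

    try:
        ast.parse("def broken(:\n    pass\n", filename="test_example.py")
    except SyntaxError as exc:
        print(exc.filename)  # -> test_example.py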

@@ -1,10 +1,12 @@
""" interactive debugging with PDB, the Python Debugger. """
from __future__ import absolute_import, division, print_function

import os
import pdb
import sys
import os
from doctest import UnexpectedException

from _pytest import outcomes
from _pytest.config import hookimpl

try:

@@ -109,9 +111,6 @@ class PdbInvoke(object):
        _enter_pdb(node, call.excinfo, report)

    def pytest_internalerror(self, excrepr, excinfo):
        for line in str(excrepr).split("\n"):
            sys.stderr.write("INTERNALERROR> %s\n" % line)
        sys.stderr.flush()
        tb = _postmortem_traceback(excinfo)
        post_mortem(tb)

@@ -164,8 +163,9 @@ def _enter_pdb(node, excinfo, rep):
    rep.toterminal(tw)
    tw.sep(">", "entering PDB")
    tb = _postmortem_traceback(excinfo)
    post_mortem(tb)
    rep._pdbshown = True
    if post_mortem(tb):
        outcomes.exit("Quitting debugger")
    return rep

@@ -196,3 +196,4 @@ def post_mortem(t):
    p = Pdb()
    p.reset()
    p.interaction(None, t)
    return p.quitting
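
Read together with the _enter_pdb hunk above: post_mortem() now reports whether the user quit the debugger, and _enter_pdb() turns that into an explicit exit. A minimal editorial sketch of the same idea (not part of the diff):

    import pdb

    from _pytest import outcomes


    def post_mortem_sketch(tb):
        p = pdb.Pdb()
        p.reset()
        p.interaction(None, tb)
        return p.quitting  # True when the user typed "q" or sent EOF


    def enter_pdb_sketch(tb):
        # propagate a quit out of the debugger as an explicit outcome,
        # which the run then reports as "Exit: Quitting debugger"
        if post_mortem_sketch(tb):
            outcomes.exit("Quitting debugger")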

@@ -1371,7 +1371,6 @@ class FixtureManager(object):
            fixturedefs = self._arg2fixturedefs[argname]
        except KeyError:
            return None
        else:
            return tuple(self._matchfactories(fixturedefs, nodeid))

    def _matchfactories(self, fixturedefs, nodeid):

@@ -570,9 +570,7 @@ class Session(nodes.FSCollector):
        return True

    def _tryconvertpyarg(self, x):
        """Convert a dotted module name to path.

        """
        """Convert a dotted module name to path."""
        try:
            with _patched_find_module():
                loader = pkgutil.find_loader(x)

@@ -604,7 +602,6 @@ class Session(nodes.FSCollector):
                raise UsageError(
                    "file or package not found: " + arg + " (missing __init__.py?)"
                )
            else:
                raise UsageError("file not found: " + arg)
        parts[0] = path
        return parts
@ -17,7 +17,7 @@ from weakref import WeakKeyDictionary
|
|||
|
||||
from _pytest.capture import MultiCapture, SysCapture
|
||||
from _pytest._code import Source
|
||||
from _pytest.main import Session, EXIT_OK
|
||||
from _pytest.main import Session, EXIT_INTERRUPTED, EXIT_OK
|
||||
from _pytest.assertion.rewrite import AssertionRewritingHook
|
||||
from _pytest.pathlib import Path
|
||||
from _pytest.compat import safe_str
|
||||
|
@ -857,7 +857,7 @@ class Testdir(object):
|
|||
|
||||
# typically we reraise keyboard interrupts from the child run
|
||||
# because it's our user requesting interruption of the testing
|
||||
if ret == 2 and not kwargs.get("no_reraise_ctrlc"):
|
||||
if ret == EXIT_INTERRUPTED and not kwargs.get("no_reraise_ctrlc"):
|
||||
calls = reprec.getcalls("pytest_keyboard_interrupt")
|
||||
if calls and calls[-1].excinfo.type == KeyboardInterrupt:
|
||||
raise KeyboardInterrupt()
|
||||
|
|
|

@@ -30,6 +30,7 @@ def pytest_addoption(parser):

def pytest_terminal_summary(terminalreporter):
    durations = terminalreporter.config.option.durations
    verbose = terminalreporter.config.getvalue("verbose")
    if durations is None:
        return
    tr = terminalreporter

@@ -49,6 +50,10 @@ def pytest_terminal_summary(terminalreporter):
    dlist = dlist[:durations]

    for rep in dlist:
        if verbose < 2 and rep.duration < 0.005:
            tr.write_line("")
            tr.write_line("(0.00 durations hidden. Use -vv to show these durations.)")
            break
        nodeid = rep.nodeid.replace("::()::", "::")
        tr.write_line("%02.2fs %-8s %s" % (rep.duration, rep.when, nodeid))

@@ -12,6 +12,13 @@ import pytest
from _pytest.main import EXIT_NOTESTSCOLLECTED, EXIT_USAGEERROR


def prepend_pythonpath(*dirs):
    cur = os.getenv("PYTHONPATH")
    if cur:
        dirs += (cur,)
    return os.pathsep.join(str(p) for p in dirs)


class TestGeneralUsage(object):
    def test_config_error(self, testdir):
        testdir.copy_example("conftest_usageerror/conftest.py")

@@ -590,14 +597,8 @@ class TestInvocationVariants(object):
        assert result.ret == 0
        result.stdout.fnmatch_lines(["*1 passed*"])

        def join_pythonpath(what):
            cur = os.environ.get("PYTHONPATH")
            if cur:
                return str(what) + os.pathsep + cur
            return what

        empty_package = testdir.mkpydir("empty_package")
        monkeypatch.setenv("PYTHONPATH", str(join_pythonpath(empty_package)))
        monkeypatch.setenv("PYTHONPATH", str(empty_package), prepend=os.pathsep)
        # the path which is not a package raises a warning on pypy;
        # no idea why only pypy and not normal python warn about it here
        with warnings.catch_warnings():

@@ -606,7 +607,7 @@ class TestInvocationVariants(object):
        assert result.ret == 0
        result.stdout.fnmatch_lines(["*2 passed*"])

        monkeypatch.setenv("PYTHONPATH", str(join_pythonpath(testdir)))
        monkeypatch.setenv("PYTHONPATH", str(testdir), prepend=os.pathsep)
        result = testdir.runpytest("--pyargs", "tpkg.test_missing", syspathinsert=True)
        assert result.ret != 0
        result.stderr.fnmatch_lines(["*not*found*test_missing*"])

@@ -646,18 +647,13 @@ class TestInvocationVariants(object):
        # ├── __init__.py
        # └── test_world.py

        def join_pythonpath(*dirs):
            cur = os.environ.get("PYTHONPATH")
            if cur:
                dirs += (cur,)
            return os.pathsep.join(str(p) for p in dirs)

        monkeypatch.setenv("PYTHONPATH", join_pythonpath(*search_path))
        # NOTE: the different/reversed ordering is intentional here.
        monkeypatch.setenv("PYTHONPATH", prepend_pythonpath(*search_path))
        for p in search_path:
            monkeypatch.syspath_prepend(p)

        # mixed module and filenames:
        os.chdir("world")
        monkeypatch.chdir("world")
        result = testdir.runpytest("--pyargs", "-v", "ns_pkg.hello", "ns_pkg/world")
        assert result.ret == 0
        result.stdout.fnmatch_lines(

@@ -708,8 +704,6 @@ class TestInvocationVariants(object):
            pytest.skip(six.text_type(e.args[0]))
        monkeypatch.delenv("PYTHONDONTWRITEBYTECODE", raising=False)

        search_path = ["lib", os.path.join("local", "lib")]

        dirname = "lib"
        d = testdir.mkdir(dirname)
        foo = d.mkdir("foo")

@@ -742,13 +736,9 @@ class TestInvocationVariants(object):
        # ├── conftest.py
        # └── test_bar.py

        def join_pythonpath(*dirs):
            cur = os.getenv("PYTHONPATH")
            if cur:
                dirs += (cur,)
            return os.pathsep.join(str(p) for p in dirs)

        monkeypatch.setenv("PYTHONPATH", join_pythonpath(*search_path))
        # NOTE: the different/reversed ordering is intentional here.
        search_path = ["lib", os.path.join("local", "lib")]
        monkeypatch.setenv("PYTHONPATH", prepend_pythonpath(*search_path))
        for p in search_path:
            monkeypatch.syspath_prepend(p)

@@ -768,8 +758,8 @@ class TestInvocationVariants(object):
        else:
            result.stdout.fnmatch_lines(
                [
                    "local/lib/foo/bar/test_bar.py::test_bar PASSED*",
                    "local/lib/foo/bar/test_bar.py::test_other PASSED*",
                    "*lib/foo/bar/test_bar.py::test_bar PASSED*",
                    "*lib/foo/bar/test_bar.py::test_other PASSED*",
                    "*2 passed*",
                ]
            )

@@ -846,7 +836,10 @@ class TestDurations(object):
        result = testdir.runpytest("--durations=10")
        assert result.ret == 0
        result.stdout.fnmatch_lines_random(
            ["*durations*", "*call*test_3*", "*call*test_2*", "*call*test_1*"]
            ["*durations*", "*call*test_3*", "*call*test_2*"]
        )
        result.stdout.fnmatch_lines(
            ["(0.00 durations hidden. Use -vv to show these durations.)"]
        )

    def test_calls_show_2(self, testdir):

@@ -860,6 +853,18 @@ class TestDurations(object):
        testdir.makepyfile(self.source)
        result = testdir.runpytest("--durations=0")
        assert result.ret == 0
        for x in "23":
            for y in ("call",):  # 'setup', 'call', 'teardown':
                for line in result.stdout.lines:
                    if ("test_%s" % x) in line and y in line:
                        break
                else:
                    raise AssertionError("not found {} {}".format(x, y))

    def test_calls_showall_verbose(self, testdir):
        testdir.makepyfile(self.source)
        result = testdir.runpytest("--durations=0", "-vv")
        assert result.ret == 0
        for x in "123":
            for y in ("call",):  # 'setup', 'call', 'teardown':
                for line in result.stdout.lines:

@@ -870,9 +875,9 @@ class TestDurations(object):

    def test_with_deselected(self, testdir):
        testdir.makepyfile(self.source)
        result = testdir.runpytest("--durations=2", "-k test_1")
        result = testdir.runpytest("--durations=2", "-k test_2")
        assert result.ret == 0
        result.stdout.fnmatch_lines(["*durations*", "*call*test_1*"])
        result.stdout.fnmatch_lines(["*durations*", "*call*test_2*"])

    def test_with_failing_collection(self, testdir):
        testdir.makepyfile(self.source)

@@ -892,13 +897,15 @@ class TestDurations(object):

class TestDurationWithFixture(object):
    source = """
        import pytest
        import time
        frag = 0.001
        def setup_function(func):
            time.sleep(frag * 3)
        def test_1():
            time.sleep(frag*2)
        def test_2():
        frag = 0.01

        @pytest.fixture
        def setup_fixt():
            time.sleep(frag)

        def test_1(setup_fixt):
            time.sleep(frag)
    """

@@ -494,6 +494,12 @@ class TestRequestBasic(object):
        reason="this method of test doesn't work on pypy",
    )
    def test_request_garbage(self, testdir):
        try:
            import xdist  # noqa
        except ImportError:
            pass
        else:
            pytest.xfail("this test is flaky when executed with xdist")
        testdir.makepyfile(
            """
            import sys

@@ -25,6 +25,8 @@ def custom_pdb_calls():

    # install dummy debugger class and track which methods were called on it
    class _CustomPdb(object):
        quitting = False

        def __init__(self, *args, **kwargs):
            called.append("init")

@@ -142,6 +144,9 @@ class TestPDB(object):
            def test_1():
                i = 0
                assert i == 1

            def test_not_called_due_to_quit():
                pass
            """
        )
        child = testdir.spawn_pytest("--pdb %s" % p1)

@@ -150,8 +155,9 @@ class TestPDB(object):
        child.expect("Pdb")
        child.sendeof()
        rest = child.read().decode("utf8")
        assert "1 failed" in rest
        assert "= 1 failed in" in rest
        assert "def test_1" not in rest
        assert "Exit: Quitting debugger" in rest
        self.flush(child)

    @staticmethod

@@ -321,7 +327,7 @@ class TestPDB(object):
        child = testdir.spawn_pytest("--pdb %s" % p1)
        # child.expect(".*import pytest.*")
        child.expect("Pdb")
        child.sendeof()
        child.sendline("c")
        child.expect("1 error")
        self.flush(child)

@@ -334,8 +340,20 @@ class TestPDB(object):
        )
        p1 = testdir.makepyfile("def test_func(): pass")
        child = testdir.spawn_pytest("--pdb %s" % p1)
        # child.expect(".*import pytest.*")
        child.expect("Pdb")

        # INTERNALERROR is only displayed once via terminal reporter.
        assert (
            len(
                [
                    x
                    for x in child.before.decode().splitlines()
                    if x.startswith("INTERNALERROR> Traceback")
                ]
            )
            == 1
        )

        child.sendeof()
        self.flush(child)

@@ -376,6 +394,7 @@ class TestPDB(object):
        rest = child.read().decode("utf8")
        assert "1 failed" in rest
        assert "reading from stdin while output" not in rest
        assert "BdbQuit" in rest
        self.flush(child)

    def test_pdb_and_capsys(self, testdir):

@@ -518,14 +537,16 @@ class TestPDB(object):
    def test_pdb_collection_failure_is_shown(self, testdir):
        p1 = testdir.makepyfile("xxx")
        result = testdir.runpytest_subprocess("--pdb", p1)
        result.stdout.fnmatch_lines(["*NameError*xxx*", "*1 error*"])
        result.stdout.fnmatch_lines(
            ["E NameError: *xxx*", "*! *Exit: Quitting debugger !*"]  # due to EOF
        )

    def test_enter_pdb_hook_is_called(self, testdir):
        testdir.makeconftest(
            """
            def pytest_enter_pdb(config):
                assert config.testing_verification == 'configured'
                print 'enter_pdb_hook'
                print('enter_pdb_hook')

            def pytest_configure(config):
                config.testing_verification = 'configured'

@@ -562,7 +583,7 @@ class TestPDB(object):
            custom_pdb="""
            class CustomPdb(object):
                def set_trace(*args, **kwargs):
                    print 'custom set_trace>'
                    print('custom set_trace>')
            """
        )
        p1 = testdir.makepyfile(

tox.ini (27 changed lines)

@@ -18,10 +18,10 @@ envlist =

[testenv]
commands =
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest --lsof -ra {posargs:testing}
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest --lsof
    coverage: coverage combine
    coverage: coverage report
passenv = USER USERNAME
passenv = USER USERNAME COVERAGE_*
setenv =
    # configuration if a user runs tox with a "coverage" factor, for example "tox -e py36-coverage"
    coverage: _PYTEST_TOX_COVERAGE_RUN=coverage run -m

@@ -36,14 +36,12 @@ deps =
    {env:_PYTEST_TOX_EXTRA_DEP:}

[testenv:py27-subprocess]
changedir = .
deps =
    pytest-xdist>=1.13
    py27: mock
    nose
passenv = USER USERNAME TRAVIS
commands =
    pytest -n auto -ra --runpytest=subprocess {posargs:testing}
    pytest -n auto --runpytest=subprocess


[testenv:linting]

@@ -59,9 +57,8 @@ deps =
    nose
    hypothesis>=3.56
    {env:_PYTEST_TOX_EXTRA_DEP:}
passenv = USER USERNAME TRAVIS
commands =
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest -n auto -ra {posargs:testing}
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest -n auto

[testenv:py36-xdist]
# NOTE: copied from above due to https://github.com/tox-dev/tox/issues/706.

@@ -74,16 +71,14 @@ deps =
commands = {[testenv:py27-xdist]commands}

[testenv:py27-pexpect]
changedir = testing
platform = linux|darwin
deps =
    pexpect
    {env:_PYTEST_TOX_EXTRA_DEP:}
commands =
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest -ra test_pdb.py test_terminal.py test_unittest.py
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest testing/test_pdb.py testing/test_terminal.py testing/test_unittest.py {posargs}

[testenv:py36-pexpect]
changedir = {[testenv:py27-pexpect]changedir}
platform = {[testenv:py27-pexpect]platform}
deps = {[testenv:py27-pexpect]deps}
commands = {[testenv:py27-pexpect]commands}

@@ -95,20 +90,18 @@ deps =
    py27: mock
    {env:_PYTEST_TOX_EXTRA_DEP:}
distribute = true
changedir=testing
setenv =
    {[testenv]setenv}
    PYTHONDONTWRITEBYTECODE=1
passenv = USER USERNAME TRAVIS
commands =
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest -n auto -ra {posargs:.}
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest -n auto {posargs}

[testenv:py27-trial]
deps =
    twisted
    {env:_PYTEST_TOX_EXTRA_DEP:}
commands =
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest -ra {posargs:testing/test_unittest.py}
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest {posargs:testing/test_unittest.py}

[testenv:py36-trial]
deps = {[testenv:py27-trial]deps}

@@ -119,7 +112,7 @@ deps =
    numpy
    {env:_PYTEST_TOX_EXTRA_DEP:}
commands=
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest -ra {posargs:testing/python/approx.py}
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest {posargs:testing/python/approx.py}

[testenv:py36-numpy]
deps = {[testenv:py27-numpy]deps}

@@ -154,7 +147,7 @@ deps =
    PyYAML
    {env:_PYTEST_TOX_EXTRA_DEP:}
commands =
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest -ra doc/en
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest doc/en
    {env:_PYTEST_TOX_COVERAGE_RUN:} pytest --doctest-modules --pyargs _pytest

[testenv:regen]

@@ -175,7 +168,7 @@ commands =
[testenv:jython]
changedir = testing
commands =
    {envpython} {envbindir}/py.test-jython -ra {posargs}
    {envpython} {envbindir}/py.test-jython {posargs}

[testenv:py36-freeze]
changedir = testing/freeze