Pytest is powerful enough out of the box, but it becomes even better when you add a mix of plugins to it. The pytest codebase is designed around configuration and extension, with hooks available for modifications and improvements through plugins.
The examples in this book were written using Python 3.6 and pytest 3.2. pytest 3.2 supports Python 2.6, 2.7, and 3.3+.
The source code for the Tasks project, as well as for all of the tests shown in this book, is available through a link on the book's web page at pragprog.com. You don't need to download the source to understand the test code; the test code is presented in a usable form in the examples. But to follow along with the Tasks project, or to adapt the test examples to test your own project (your hands are untied!), you should go to the book's web page and download the project. The book's web page also has a link for submitting errata and a discussion forum.
You may be surprised to learn that you've already written some plugins if you've worked through the previous chapters of this book. Every time you put fixtures and/or hook functions into a project's top-level conftest.py file, you create a local conftest plugin. It's just a little bit of extra work to convert these conftest.py files into installable plugins that you can share between projects, with other people, or with the world.
We will begin this chapter by looking at where to find third-party plugins. Quite a few plugins are available, so there's a decent chance someone has already written the change you want to make to pytest. Since plugins are open source, if a plugin does almost what you want but not quite, you can extend it, or use it as a reference for creating your own plugin. While this chapter is about creating your own plugins, Appendix 3, Plugin Sampler Pack, on page 163 is included to give you a taste of what's possible.
In this chapter, you will learn how to create plugins, and I'll point you in the right direction for testing, packaging, and distributing them. The full topic of Python packaging and distribution is extensive enough to deserve its own book, so we won't cover everything. But you'll get far enough to be able to share plugins with your team. I'll also discuss some shortcuts for getting PyPI-distributed plugins up and running with the least amount of work.
You can find third-party pytest plugins in several places. The plugins listed in Appendix 3, Plugin Sampler Pack, on page 163, are available for download from PyPI. However, this is not the only place to look for great pytest plugins.
https://docs.pytest.org/en/latest/plugins.html
On the main pytest documentation site, there is a page that talks about installing and using pytest plugins and lists several common plugins.
The Python Package Index (PyPI) is a great place to get lots of Python packages, and it is also a great place to find pytest plugins. When looking for pytest plugins, try entering "pytest", "pytest-", or "-pytest" into the search box, since most pytest plugins either start with "pytest-" or end in "-pytest".
The "pytest-dev" group on GitHub is where the pytest source code is kept. It's also where you can find popular pytest plugins that are intended to be maintained long-term by the pytest core team.
Pytest plugins are installed with pip, just like other Python packages. However, pip can be used in several different ways to install plugins.
Since PyPI is the default location pip pulls from, installing plugins from PyPI is the easiest method. Let's install the pytest-cov plugin:
$ pip install pytest-cov
The latest stable version from PyPI will be installed.
If you want a specific version of a plugin, you can specify the version after ==:
$ pip install pytest-cov==2.4.0
PyPI packages are distributed as .tar.gz files and/or .whl files, often called "tarballs" and "wheels." If you have trouble working with PyPI directly (which can happen with firewalls and other network complications), you can download either the .tar.gz or the .whl and install from that.
You don't need to unpack anything or perform any ritual dances; just point pip at the file:
$ pip install pytest-cov-2.4.0.tar.gz
# or
$ pip install pytest_cov-2.4.0-py2.py3-none-any.whl
You can keep a stash of plugins (and other Python packages) in .tar.gz or .whl form in a local or shared directory and use that instead of PyPI for installing plugins:
$ mkdir some_plugins
$ cp pytest_cov-2.4.0-py2.py3-none-any.whl some_plugins/
$ pip install --no-index --find-links=./some_plugins/ pytest-cov
--no-index tells pip not to connect to PyPI. --find-links=./some_plugins/ tells pip to look in the some_plugins directory. This method is especially useful if you keep both third-party and your own plugins stored locally, and also if you're creating new virtual environments for continuous integration or with tox. (We'll talk about both tox and continuous integration in Chapter 7, Using pytest with Other Tools, on page 125.)
Note that with the local directory method, you can still install a specific version by adding == and the version number:
$ pip install --no-index --find-links=./some_plugins/ pytest-cov==2.4.0
You can install plugins directly from a Git repository, in this case one on GitHub:
$ pip install git+https://github.com/pytest-dev/pytest-cov
You can also specify a version tag:
$ pip install git+https://github.com/pytest-dev/pytest-cov@v2.4.0
Or you can specify a branch:
$ pip install git+https://github.com/pytest-dev/pytest-cov@master
Installing from a Git repository is especially useful if you keep your own work in Git, or if the plugin or plugin version you need isn't on PyPI.
Translator's note: pip supports installing from Git, Mercurial, Subversion, and Bazaar, and determines the VCS type from the URL prefix: "git+", "hg+", "svn+", "bzr+". More details can be found in the pip documentation.
Many third-party plugins contain quite a bit of code. That's one of the reasons we use them: to save us the time of developing all of it ourselves. However, for your specific coding domain, you'll undoubtedly come up with special fixtures and modifications that help you test. Even a handful of fixtures that you want to share between a few projects can be shared easily by turning them into a plugin. You can share those changes with multiple projects, and possibly the rest of the world, by developing and distributing your own plugins. It's pretty easy to do. In this section, we'll develop a small modification to pytest behavior, package it as a plugin, test it, and look at how to distribute it.
Plugins can include hook functions that alter pytest's behavior. Because pytest was designed with plugin modification in mind, many hook functions are available; they are listed on the pytest documentation site. For our example, we'll create a plugin that changes the way the test status looks, adds a command-line option to turn on this new behavior, and adds text to the output header. Specifically, we'll change all of the FAILED status indicators to "OPPORTUNITY for improvement," change F to O, and add "Thanks for running the tests" to the header. We'll use the --nice option to turn the new behavior on.
To keep the behavior changes separate from the discussion of plugin mechanics, we'll make our changes in conftest.py before turning them into a distributable plugin. You don't have to start plugins this way. But frequently, changes you intended to use in only one project turn out to be useful enough to share and worth turning into a plugin. Therefore, we'll start by adding the functionality to a conftest.py file, and then, once everything works there, we'll move the code into a package.
Let's go back to the Tasks project. In "Expecting Exceptions," on page 30, we wrote some tests that made sure exceptions were raised if someone called an API function incorrectly. Looks like we missed at least a few possible error conditions. Here are a couple more tests:
ch5/a/tasks_proj/tests/func/test_api_exceptions.py
"""Test for expected exceptions from using the API wrong."""
import pytest
import tasks
from tasks import Task


@pytest.mark.usefixtures('tasks_db')
class TestAdd():
    """Tests related to tasks.add()."""

    def test_missing_summary(self):
        """Should raise an exception if summary is missing."""
        with pytest.raises(ValueError):
            tasks.add(Task(owner='bob'))

    def test_done_not_bool(self):
        """Should raise an exception if done is not a bool."""
        with pytest.raises(ValueError):
            tasks.add(Task(summary='summary', done='True'))
Let's run them to check if they pass:
$ cd /path/to/code/ch5/a/tasks_proj
$ pytest
===================== test session starts ======================
collected 57 items

tests/func/test_add.py ...
tests/func/test_add_variety.py ............................
tests/func/test_add_variety2.py ............
tests/func/test_api_exceptions.py .F.......
tests/func/test_unique_id.py .
tests/unit/test_task.py ....

=========================== FAILURES ===========================
__________________ TestAdd.test_done_not_bool __________________

self = <func.test_api_exceptions.TestAdd object at 0x103a71a20>

    def test_done_not_bool(self):
        """Should raise an exception if done is not a bool."""
        with pytest.raises(ValueError):
>           tasks.add(Task(summary='summary', done='True'))
E           Failed: DID NOT RAISE <class 'ValueError'>

tests/func/test_api_exceptions.py:20: Failed
============= 1 failed, 56 passed in 0.28 seconds ==============
Let's run them again with -v for more detail. Since you've already seen the traceback, you can turn it off with --tb=no.
Now let's focus on the new tests with -k TestAdd, which works because there are no other tests with "TestAdd" in their names.
We could go off and try to fix this test (and we will later), but for now, let's focus on making failures more pleasant for developers.
Let's start by adding the "thank you" message to the header, which you can do with the pytest hook pytest_report_header().
ch5/b/tasks_proj/tests/conftest.py
def pytest_report_header():
    """Thank tester for running tests."""
    return "Thanks for running the tests."
Obviously, printing a thank-you message is rather silly. However, the ability to add information to the header can be extended to add a username, specify hardware used, or list versions under test. Really, anything you can convert to a string can go into the test header.
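As a small illustration of that idea (this sketch is not from the book; it only uses the standard-library getpass and platform modules), a header hook could report the user and Python version alongside the thank-you. Returning a list of strings produces multiple header lines:

```python
import getpass
import platform


def pytest_report_header():
    """Return extra lines for the test session header."""
    # A list of strings becomes multiple header lines in the report.
    return [
        "Thanks for running the tests.",
        "user: {}".format(getpass.getuser()),
        "python: {}".format(platform.python_version()),
    ]
```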
Next, we'll change the status reporting for tests to change F to O and FAILED to OPPORTUNITY for improvement. There's a hook that allows this as well: pytest_report_teststatus():
ch5/b/tasks_proj/tests/conftest.py
def pytest_report_teststatus(report):
    """Turn failures into opportunities."""
    if report.when == 'call' and report.failed:
        return (report.outcome, 'O', 'OPPORTUNITY for improvement')
And now we have just the output we were looking for. A test session without the --verbose flag shows an O for failures, or rather, opportunities:
$ cd /path/to/code/ch5/b/tasks_proj/tests/func
$ pytest --tb=no test_api_exceptions.py -k TestAdd
===================== test session starts ======================
Thanks for running the tests.
collected 9 items

test_api_exceptions.py .O

====================== 7 tests deselected ======================
======= 1 failed, 1 passed, 7 deselected in 0.06 seconds =======
With the -v or --verbose flag, it gets even better:
$ pytest -v --tb=no test_api_exceptions.py -k TestAdd
===================== test session starts ======================
Thanks for running the tests.
collected 9 items

test_api_exceptions.py::TestAdd::test_missing_summary PASSED
test_api_exceptions.py::TestAdd::test_done_not_bool OPPORTUNITY for improvement

====================== 7 tests deselected ======================
======= 1 failed, 1 passed, 7 deselected in 0.07 seconds =======
The last change we'll make is to add a command-line option, --nice, so that our status changes take effect only when --nice is passed:
import pytest


def pytest_addoption(parser):
    """Turn nice features on with --nice option."""
    group = parser.getgroup('nice')
    group.addoption("--nice", action="store_true",
                    help="nice: turn failures into opportunities")


def pytest_report_header():
    """Thank tester for running tests."""
    if pytest.config.getoption('nice'):
        return "Thanks for running the tests."


def pytest_report_teststatus(report):
    """Turn failures into opportunities."""
    if report.when == 'call':
        if report.failed and pytest.config.getoption('nice'):
            return (report.outcome, 'O', 'OPPORTUNITY for improvement')
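A note beyond the book: the pytest.config global used above was deprecated in pytest 4 and removed in pytest 5, so on modern pytest this exact code no longer runs. The same plugin can be written by accepting the config object as a hook argument, roughly like this sketch:

```python
def pytest_addoption(parser):
    """Turn nice features on with --nice option."""
    group = parser.getgroup('nice')
    group.addoption("--nice", action="store_true",
                    help="nice: turn failures into opportunities")


def pytest_report_header(config):
    """Thank tester for running tests."""
    # Modern pytest passes the config object into the hook directly.
    if config.getoption('nice'):
        return "Thanks for running the tests."


def pytest_report_teststatus(report, config):
    """Turn failures into opportunities."""
    if report.when == 'call' and report.failed and config.getoption('nice'):
        return (report.outcome, 'O', 'OPPORTUNITY for improvement')
```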
It's worth noting that this plugin uses just a couple of hooks. There are many more, which can be found on the main pytest documentation site.
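For instance, one commonly used hook that pytest-nice doesn't need is pytest_collection_modifyitems(), which lets a plugin reorder or filter the collected tests. A minimal sketch (not part of pytest-nice) that reverses test order, for example to flush out accidental inter-test dependencies:

```python
def pytest_collection_modifyitems(items):
    """Reverse the order of collected test items in place."""
    # pytest calls this hook after collection; mutating the list
    # changes the order in which tests run.
    items.reverse()
```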
Now we can manually test our plugin by running it against our sample file. First, without the --nice option, to make sure the default behavior is unchanged:
$ cd /path/to/code/ch5/c/tasks_proj/tests/func
$ pytest --tb=no test_api_exceptions.py -k TestAdd
===================== test session starts ======================
collected 9 items

test_api_exceptions.py .F

====================== 7 tests deselected ======================
======= 1 failed, 1 passed, 7 deselected in 0.07 seconds =======
Now with --nice:
$ pytest --nice --tb=no test_api_exceptions.py -k TestAdd
===================== test session starts ======================
Thanks for running the tests.
collected 9 items

test_api_exceptions.py .O

====================== 7 tests deselected ======================
======= 1 failed, 1 passed, 7 deselected in 0.06 seconds =======
And with --nice and --verbose:
$ pytest -v --nice --tb=no test_api_exceptions.py -k TestAdd
===================== test session starts ======================
Thanks for running the tests.
collected 9 items

test_api_exceptions.py::TestAdd::test_missing_summary PASSED
test_api_exceptions.py::TestAdd::test_done_not_bool OPPORTUNITY for improvement

====================== 7 tests deselected ======================
======= 1 failed, 1 passed, 7 deselected in 0.06 seconds =======
Great! All of the changes we wanted were made with about a dozen lines of code in a conftest.py file. Next, we'll move this code into a plugin structure.
The process of sharing plugins with others is well defined. Even if you never put your own plugin on PyPI, walking through the process will make it easier for you to read the code of open source plugins, and you'll be better able to judge whether they will help you or not.
It would be overkill to fully cover Python packaging and distribution in this book, since the topic is well documented elsewhere. However, the journey from the local conftest plugin we created in the previous section to something pip-installable is short.
First, we need to create a new directory to hold our plugin code. It doesn't matter what you call it, but since we're making a plugin for the nice flag, let's call it pytest-nice. We'll have two files in this new directory: pytest_nice.py and setup.py. (The tests directory will be discussed in "Testing Plugins," on page 105.)
│   LICENSE
│   pytest_nice.py
│   setup.py
│
└───tests
        conftest.py
        test_nice.py
In pytest_nice.py, we'll put the exact contents of our conftest.py that were related to this feature (and remove them from tasks_proj/tests/conftest.py):
ch5/pytest-nice/pytest_nice.py
"""Code for pytest-nice plugin."""

import pytest


def pytest_addoption(parser):
    """Turn nice features on with --nice option."""
    group = parser.getgroup('nice')
    group.addoption("--nice", action="store_true",
                    help="nice: turn FAILED into OPPORTUNITY for improvement")


def pytest_report_header():
    """Thank tester for running tests."""
    if pytest.config.getoption('nice'):
        return "Thanks for running the tests."


def pytest_report_teststatus(report):
    """Turn failures into opportunities."""
    if report.when == 'call':
        if report.failed and pytest.config.getoption('nice'):
            return (report.outcome, 'O', 'OPPORTUNITY for improvement')
In setup.py, we need a minimal setup() call:
ch5/pytest-nice/setup.py
"""Setup for pytest-nice plugin."""

from setuptools import setup

setup(
    name='pytest-nice',
    version='0.1.0',
    description='A pytest plugin to turn FAILURE into OPPORTUNITY',
    url='https://wherever/you/have/info/on/this/package',
    author='Your Name',
    author_email='your_email@somewhere.com',
    license='proprietary',
    py_modules=['pytest_nice'],
    install_requires=['pytest'],
    entry_points={'pytest11': ['nice = pytest_nice', ], },
)
You'll need more information in the setup if you're going to distribute to a wide audience or online. However, for a small team or just for yourself, this will suffice.
You can include many more parameters to setup(); here we have only the required fields. The version field is the version of this plugin, and when to bump it is entirely up to you. The url field is required; you can leave it out, but you'll get a warning if you do. The author and author_email fields can be replaced with maintainer and maintainer_email, but one of those pairs must be present. The license field is a short text field. It can be one of the many open source licenses, your name or your company's name, or whatever is appropriate for you. The py_modules entry lists pytest_nice as our one and only module for this plugin. Although it's a list and you could include multiple modules, if I had more than one, I'd use a package and put all the modules into one directory.
So far, all of the parameters to setup() are standard and used for all Python installers. The piece that is different for pytest plugins is the entry_points parameter. We've listed entry_points={'pytest11': ['nice = pytest_nice', ], },. The entry_points feature is standard for setuptools, but pytest11 is a special identifier that pytest looks for. With this line, we're telling pytest that nice is the name of our plugin, and pytest_nice is the name of the module where our plugin lives. If we had used a package, our entry here would be:
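For a package, the entry point names the module inside the package. With hypothetical names standing in for your own plugin name, package, and module, it would look something like this:

```python
entry_points={'pytest11': ['name_of_plugin = myproject.pluginmodule', ], },
```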
I haven't talked about the README.rst file yet. Some form of README is a requirement of setuptools. If you leave it out, you'll get this:
... warning: sdist: standard file not found: should have one of README, README.rst, README.txt ...
Keeping a README around as a standard way to include some information about a project is a good idea anyway. Here's what I've put in the file for pytest-nice:
ch5/pytest-nice/README.rst
pytest-nice : A pytest plugin
=============================

Makes pytest output just a bit nicer during failures.

Features
--------

- Adds the ``--nice`` option, which turns failures into opportunities:

  - ``F`` shows up as ``O``
  - with ``-v``, ``FAILURE`` shows up as ``OPPORTUNITY for improvement``

Installation
------------

Given that our pytest plugins are being saved in .tar.gz form in the
shared directory PATH, then install like this::

    $ pip install PATH/pytest-nice-0.1.0.tar.gz

    $ pip install --no-index --find-links PATH pytest-nice

Usage
-----

::

    $ pytest --nice
There are many opinions about what should go into a README. This is a fairly stripped-down version, but it works.
Plugins are code that needs to be tested, just like any other code. However, testing a change to a testing tool is a little tricky. When we developed the plugin code in "Writing Your Own Plugins," on page 98, we tested it manually by using a sample test file, running pytest against it, and looking at the output to make sure it was right. We can do the same thing in an automated way using a plugin called pytester, which ships with pytest but is disabled by default.
Our test directory for pytest-nice has two files: conftest.py and test_nice.py. To use pytester, we need to add just one line to conftest.py:
ch5/pytest-nice/tests/conftest.py
"""pytester is needed for testing plugins."""
pytest_plugins = 'pytester'
This turns on the pytester plugin. We will be using a fixture called testdir, which becomes available when pytester is enabled.
Often, tests for plugins take on the form we've already described doing manually:

1. Make an example test file.
2. Run pytest with or without some options in the directory that contains our example file.
3. Examine the output.
4. Possibly check the result code: 0 for all passing, 1 for some failing.
Let's look at one example:
ch5/pytest-nice/tests/test_nice.py
def test_pass_fail(testdir):

    # create a temporary pytest test module
    testdir.makepyfile("""
        def test_pass():
            assert 1 == 1

        def test_fail():
            assert 1 == 2
    """)

    # run pytest
    result = testdir.runpytest()

    # fnmatch_lines does an assertion internally
    result.stdout.fnmatch_lines([
        '*.F',  # . for Pass, F for Fail
    ])

    # make sure that we get a '1' exit code for the testsuite
    assert result.ret == 1
The testdir fixture automatically creates a temporary directory for placing test files. It has a method, makepyfile(), that lets us fill in the contents of a test file. Here, we create two tests: one that passes and one that fails.
We run pytest against the new test file with testdir.runpytest(). You can pass flags if you want. The return value can then be examined further; it is of type RunResult.
Usually, I look at stdout and ret. To check the output the way we did manually, use fnmatch_lines, passing in a list of strings we expect to see in the output, and then make sure that ret is 0 for passing sessions or 1 for failing sessions. The strings passed to fnmatch_lines can contain glob wildcards. We'll use our example file for more tests, so instead of repeating it everywhere, let's put it in a fixture:
ch5/pytest-nice/tests/test_nice.py
@pytest.fixture()
def sample_test(testdir):
    testdir.makepyfile("""
        def test_pass():
            assert 1 == 1

        def test_fail():
            assert 1 == 2
    """)
    return testdir
Now, for the rest of the tests, we can use sample_test as a directory that already contains our sample test file. Here are the tests for the option variants:
ch5/pytest-nice/tests/test_nice.py
def test_with_nice(sample_test):
    result = sample_test.runpytest('--nice')
    result.stdout.fnmatch_lines(['*.O', ])  # . for Pass, O for Fail
    assert result.ret == 1


def test_with_nice_verbose(sample_test):
    result = sample_test.runpytest('-v', '--nice')
    result.stdout.fnmatch_lines([
        '*::test_fail OPPORTUNITY for improvement',
    ])
    assert result.ret == 1


def test_not_nice_verbose(sample_test):
    result = sample_test.runpytest('-v')
    result.stdout.fnmatch_lines(['*::test_fail FAILED'])
    assert result.ret == 1
Just a couple more tests to write. We need to make sure the thank-you message shows up in the header with --nice, and only with --nice:
ch5/pytest-nice/tests/test_nice.py
def test_header(sample_test):
    result = sample_test.runpytest('--nice')
    result.stdout.fnmatch_lines(['Thanks for running the tests.'])


def test_header_not_nice(sample_test):
    result = sample_test.runpytest()
    thanks_message = 'Thanks for running the tests.'
    assert thanks_message not in result.stdout.str()
These checks could have been folded into the other tests, but I like keeping them separate so that a single test checks a single idea. Finally, a test for the help text:
ch5/pytest-nice/tests/test_nice.py
def test_help_message(testdir):
    result = testdir.runpytest('--help')

    # fnmatch_lines does an assertion internally
    result.stdout.fnmatch_lines([
        'nice:',
        '*--nice*nice: turn FAILED into OPPORTUNITY for improvement',
    ])
I think that's a reasonable test suite for our plugin.
To run the tests against pytest-nice, the plugin has to be installed. We could build a .tar.gz or .whl first, but pip can also install a plugin directly from its source directory:
$ cd /path/to/code/ch5/pytest-nice/
$ pip install .
Processing /path/to/code/ch5/pytest-nice
Requirement already satisfied: pytest in /path/to/venv/lib/python3.6/site-packages (from pytest-nice==0.1.0)
Requirement already satisfied: py>=1.4.33 in /path/to/venv/lib/python3.6/site-packages (from pytest->pytest-nice==0.1.0)
Requirement already satisfied: setuptools in /path/to/venv/lib/python3.6/site-packages (from pytest->pytest-nice==0.1.0)
Building wheels for collected packages: pytest-nice
  Running setup.py bdist_wheel for pytest-nice ... done
...
Successfully built pytest-nice
Installing collected packages: pytest-nice
Successfully installed pytest-nice-0.1.0
Now, with the plugin installed, we can run our new tests:
$ pytest -v
===================== test session starts ======================
plugins: nice-0.1.0
collected 7 items

tests/test_nice.py::test_pass_fail PASSED
tests/test_nice.py::test_with_nice PASSED
tests/test_nice.py::test_with_nice_verbose PASSED
tests/test_nice.py::test_not_nice_verbose PASSED
tests/test_nice.py::test_header PASSED
tests/test_nice.py::test_header_not_nice PASSED
tests/test_nice.py::test_help_message PASSED

=================== 7 passed in 0.34 seconds ===================
Translator's note: with newer versions of pytest, some of these tests fail. Here is the output from a run with pytest 3.9.3 on Windows:
platform win32 -- Python 3.6.5, pytest-3.9.3, py-1.7.0, pluggy-0.8.0 -- c:\venv36\scripts\python.exe
collected 7 items

tests/test_nice.py::test_pass_fail FAILED [ 14%]
tests/test_nice.py::test_with_nice OPPORTUNITY for improvement [ 28%]
tests/test_nice.py::test_with_nice_verbose OPPORTUNITY for improvement [ 42%]
tests/test_nice.py::test_not_nice_verbose FAILED [ 57%]
tests/test_nice.py::test_header PASSED [ 71%]
tests/test_nice.py::test_header_not_nice PASSED [ 85%]
tests/test_nice.py::test_help_message PASSED [100%]

================================== FAILURES ===================================
_______________________________ test_pass_fail ________________________________
result.stdout.fnmatch_lines([
    '*.F',  # . for Pass, F for Fail
])
has to be changed to:
result.stdout.fnmatch_lines([
    '*.F*',  # . for Pass, F for Fail
])
The trailing * after the F lets the pattern tolerate the progress percentage that newer pytest appends to each output line. The tests test_with_nice, test_with_nice_verbose, and test_not_nice_verbose fail for the same reason in newer pytest and need the same kind of fix, since the expected output line now looks like 'test_with_nice.py .O [100%]'.
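The fix works because fnmatch_lines matches each pattern with fnmatch-style globbing, where * matches any run of characters. A quick standard-library sketch (the sample line is hypothetical) shows why the trailing * matters once pytest appends a progress percentage:

```python
from fnmatch import fnmatch

# A status line as newer pytest prints it, with a trailing percentage.
line = "test_api_exceptions.py .O                [100%]"

# '*.O' requires the line to end right after the O, so it no longer matches.
assert not fnmatch(line, "*.O")

# '*.O*' tolerates whatever follows, including the progress percentage.
assert fnmatch(line, "*.O*")
```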
In addition, a deprecation warning shows up: RemovedInPytest4Warning: usage of Session.Class is deprecated, please use pytest.Class instead. But with the patterns fixed, all the tests pass!
(venv36) c:\_BOOKS_\pytest_si\bopytest-code\code\ch5\pytest-nice>pytest -v
============================= test session starts =============================
platform win32 -- Python 3.6.5, pytest-3.9.3, py-1.7.0, pluggy-0.8.0 -- c:\venv36\scripts\python.exe
cachedir: .pytest_cache
rootdir: c:\_BOOKS_\pytest_si\bopytest-code\code\ch5\pytest-nice, inifile:
plugins: nice-0.1.0
collected 7 items

tests/test_nice.py::test_pass_fail PASSED [ 14%]
tests/test_nice.py::test_with_nice PASSED [ 28%]
tests/test_nice.py::test_with_nice_verbose PASSED [ 42%]
tests/test_nice.py::test_not_nice_verbose PASSED [ 57%]
tests/test_nice.py::test_header PASSED [ 71%]
tests/test_nice.py::test_header_not_nice PASSED [ 85%]
tests/test_nice.py::test_help_message PASSED [100%]

============================== warnings summary ===============================
tests/test_nice.py::test_pass_fail
  c:\venv36\lib\site-packages\_pytest\compat.py:332: RemovedInPytest4Warning: usage of Session.Class is deprecated, please use pytest.Class instead
    return getattr(object, name, default)
Hooray! Our plugin passes its own tests. Uninstalling pytest-nice works just like uninstalling any other Python package or pytest plugin:
$ pip uninstall pytest-nice
Uninstalling pytest-nice-0.1.0:
  Would remove:
    \path\to\venv\lib\site-packages\pytest_nice-0.1.0.dist-info\*
    ...
Proceed (y/n)? y
  Successfully uninstalled pytest-nice-0.1.0
Next, let's look at how to bundle the plugin into a distribution that can be shared through a directory or through PyPI.
Creating a distribution is straightforward once setup.py is in place. From the plugin directory, we run setup.py sdist:
$ cd /path/to/code/ch5/pytest-nice
$ python setup.py sdist
running sdist
running egg_info
creating pytest_nice.egg-info
...
running check
creating pytest-nice-0.1.0
...
creating dist
Creating tar archive
...
$ ls dist
pytest-nice-0.1.0.tar.gz
(By the way, sdist stands for "source distribution.") After running it, a dist directory appears inside pytest-nice, containing the file pytest-nice-0.1.0.tar.gz.
This file can now be used anywhere to install our plugin, even in place:
$ pip install dist/pytest-nice-0.1.0.tar.gz
Processing ./dist/pytest-nice-0.1.0.tar.gz
...
Installing collected packages: pytest-nice
Successfully installed pytest-nice-0.1.0
The resulting .tar.gz file can be placed anywhere you need it, and the plugin can be installed from there.
pip is happy to install from a single file, but it's more convenient to collect your plugin archives in one local or shared directory and let pip find them by name. Suppose we copy pytest-nice-0.1.0.tar.gz into a directory called myplugins.
To install pytest-nice from myplugins:
$ pip install --no-index --find-links myplugins pytest-nice
The --no-index tells pip not to connect to PyPI.
The --find-links myplugins tells pip to look in myplugins for packages to install. And of course, pytest-nice is what we want to install.
If you fix some bugs and drop updated versions of the plugin into myplugins, add --upgrade to pick up the new version:
$ pip install --upgrade --no-index --find-links myplugins pytest-nice
This is just a normal pip install with --upgrade, plus --no-index --find-links myplugins.
If you want to share your plugin with the whole world, you can publish it on PyPI. That process takes a few more steps than we'll cover here; for complete instructions, see the Python Packaging User Guide.
For pytest plugins headed to PyPI, a great place to start is the cookiecutter-pytest-plugin project:
$ pip install cookiecutter
$ cookiecutter https://github.com/pytest-dev/cookiecutter-pytest-plugin
This is a mature project maintained in the pytest-dev group. It asks you a series of questions and then generates a plugin directory structure with most of what you need; all you have to do is fill in your plugin's details. If you're thinking about publishing a pytest plugin on PyPI, use this project.
Exercise: in ch4/cache/test_slower.py there's an autouse fixture, check_duration(), which we also used in the Chapter 4 exercises. Try turning it into an installable plugin.
You've used conftest.py many times in this book so far. There are also configuration files that affect how pytest runs, such as pytest.ini. In the next chapter, you'll take a look at the various configuration files and learn what you can do in them to make your testing life easier.
Source: https://habr.com/ru/post/448794/