Hi, Habrahabr!
In this post I will talk about deploying Python projects: how to get the code and all the required third-party modules onto the server. Many of us have faced the problem of rolling a project out to a production server, but little has been written about it on Habr; I want to share my experience.

Problem
Everyone sets up a project differently; for completeness, here is briefly what I use:
- pip and virtualenv for installing packages locally
- uwsgi as a layer between nginx and the web application
- supervisor to run the necessary processes: uwsgi, celery, tornado, etc.
- nginx running as root and facing the outside world
In our projects we often use modules that are at least partly written in C and require compilation. This is where the inconvenience begins: you have to install a compiler and a pile of dev packages on the server just to build your project's dependencies (such as simplejson or lxml). The first thing that comes to mind is to compile locally and simply copy the result to the server. What could go wrong? Both machines are x86_64. But as it turns out, binary compatibility is a shaky concept on Linux systems. I have some guesses as to why, but I am not qualified to go into the details. In short, what you compile on Ubuntu will not necessarily run on Debian. If you have a fleet of servers, you can dedicate one of them to building packages, or set up an identical virtual machine locally and compile there.
When deploying a project to a production server, there are two options: compile the code on the server, or build packages yourself. Personally, this meager choice upsets me. The obvious question is why module authors or PyPI do not ship builds for different platforms; aren't there binary eggs? There were, and you could even install them with easy_install, but the attempt failed: there is no guarantee those binaries will work on your server even with the same Python version and architecture. And far from all packages that require compilation have binary builds on PyPI. Incidentally, pip does not support binary eggs for exactly this reason. Compiling the code on the server is a painful approach, especially if you have many servers to deploy to, and even sitting at a laptop the process is simply exhausting. Everything should be simple: just download and install.
Attempt #1
When I first took on this problem, I went the hard way. I set up my own Debian repository and packaged the project into native deb packages. Everything was fully automated: I wrote init scripts to launch the project automatically, and a fabric script to build the deb packages and push the project to the server. The project was split into three deb packages: the project code, the virtualenv with all the modules, and the configurations (dev, prod). Installing and running on the server came down to a single command:
sudo apt-get install my-project-venv my-project-dev-conf my-project
This is all great, of course, but it was genuinely annoying that updating a single module in the virtualenv meant recompiling all the modules just to rebuild the deb package. Making the unportable virtualenv relocatable also involved a lot of dancing with a tambourine: rewriting the path in the #!python shebang line of every file in the bin folder, deleting all pyc files, deleting all symlinks, and reinstalling all packages from the src folder into site-packages. Getting this colossus deployed took a lot of time, and in hindsight it is not at all obvious that I made my life any easier.
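The manual cleanup described above can be sketched roughly as follows; demo-venv and its contents here are a fake layout created on the spot purely for illustration, not a real virtualenv:

```shell
# Sketch: make a virtualenv's scripts relocatable (demo-venv is a fake layout).
set -e
mkdir -p demo-venv/bin
printf '#!/home/me/demo-venv/bin/python\nprint("hi")\n' > demo-venv/bin/myscript
touch demo-venv/bin/stale.pyc
ln -sf /usr/bin/python demo-venv/bin/python

# 1. rewrite the hard-coded #!python shebang to a relocatable one
sed -i '1s|^#!.*python.*$|#!/usr/bin/env python|' demo-venv/bin/myscript
# 2. drop compiled .pyc files, which cache absolute paths
find demo-venv -name '*.pyc' -delete
# 3. drop symlinks that point outside the environment
find demo-venv -type l -delete

head -n 1 demo-venv/bin/myscript
```

A real virtualenv has many more files to patch, which is exactly why this approach gets tedious fast.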
Attempt #2
The second time around, I decided it would be better to build all the required modules as separate binary packages. I stumbled upon a relatively new project called wheel and decided to give it a try.
Wheel is an alternative to binary eggs; the author tries to follow current best practices, and not long ago his PEP was accepted. Among the differences, it is notable that wheel is an installation format, not an import format. Wheel is also a good helper on a local machine: you can download and compile all your frequently used packages into one folder once, and then install packages from there in O(1) whenever you create a new virtual environment.
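With today's pip this local-cache workflow might look like the sketch below (the package names are just examples; at the time of writing, older pip versions instead required installing the wheel package and passing --use-wheel):

```shell
# Build wheels for commonly used packages once, into a local folder
pip wheel --wheel-dir=~/wheelhouse lxml simplejson

# In any fresh virtualenv, install from that folder: no compiler, no network
pip install --no-index --find-links=~/wheelhouse lxml simplejson
```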
Now, to deploy a project, I do the following:
- I set up my own package index: there are quite a few projects on GitHub that let you run a private PyPI; I use localshop, because it can restrict download access.
- I build all the dependencies (from requires.txt) in the wheel format and upload them to my index: for this I had to patch localshop a little, since it did not support the format.
- So as not to install git on the server, I also package the project itself and push it to my index.
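The publishing side of the steps above can be sketched with modern tooling roughly like this; the index URL is hypothetical, and the upload assumes your private index accepts standard uploads (as noted, localshop needed a small patch to take wheels):

```shell
# Build wheels for every dependency listed in requires.txt
pip wheel --wheel-dir=./wheelhouse -r requires.txt

# Package the project itself the same way
pip wheel --wheel-dir=./wheelhouse .

# Push everything to the private index (URL is hypothetical)
twine upload --repository-url https://pypi.example.com/ wheelhouse/*.whl
```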
On the server side, all that is left is to install and run. It looks like this:
virtualenv myproject
. myproject/bin/activate
Running under uwsgi:
pip install --use-wheel uwsgi
uwsgi --module=myproject.wsgi --home=myproject ....
Starting supervisor:
pip install --use-wheel supervisor
supervisord -c supervisor.conf -j supervisor.pid
I write in Python 2.7.3, which is the default on my Ubuntu, while the servers run Debian with Python 2.6. There are, of course, differences between them: for example, Python 2.6 does not support auto-numbered '{}' fields in str.format(). Installing 2.7.3 from distribution packages is not the best idea; it is easier to compile Python yourself, and the pythonbrew project is a good helper for that.
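Matching the servers' interpreter locally with pythonbrew might look like the sketch below; the version number is an example, and the commands assume pythonbrew is already installed and sourced into the shell:

```shell
# Build and switch to the same Python the servers run
pythonbrew install 2.6.8
pythonbrew switch 2.6.8

# A virtualenv created now picks up the switched interpreter
virtualenv myproject
. myproject/bin/activate
python --version   # should report 2.6.8
```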
Laziness is the engine of progress! We are all lazy programmers (especially those of us who write Python and Ruby), and when we run into inconveniences, we want to make life easier. And how do you deploy?
Links
- lucumr.pocoo.org/2012/6/22/hate-hate-hate-everywhere
- github.com/pypa/pip/issues/492
- hynek.me/articles/python-app-deployment-with-native-packages
- crate.io/packages/wheel
- www.python.org/dev/peps/pep-0427
- bitbucket.org/dholth/wheel
- wheel.readthedocs.org/en/latest/story.html
- github.com/mvantellingen/localshop
- github.com/utahta/pythonbrew