
My own build system for Linux

Hello! I haven't appeared here as an author for a long time, but this time I decided to share something I made myself, to find out whether you need it or not, how it could be improved, and generally to hear some feedback on what I did.

Motivation


The problem of building and running a project on different machines has always haunted me. To realistically simulate the developed site on a local machine, you need to install a web server, an application server (perhaps some other intermediate server will join them), install a database, and configure that database. To deploy a test site on a test server, you have to do the same work again. And later the same again on the production server.

It seems the problem is solved easily: write all the commands into a file and just run it everywhere. The solution is relatively good, but not perfect, and here's why. Suppose the necessary packages are already installed on one of the servers and the database is ready there, but not completely: the recent migrations have not been applied to it. You have to open the file with the commands and pick out the necessary ones from it so as not to get an error or break something.

But that is not such a serious problem; I identified a bigger one for myself when working with Django. Django, as you know, stays resident in memory once started, and if the code is changed, those changes do not affect the running site. You have to restart the server constantly. Not hard. But what if the models change and you also need to create and apply migrations? What if the web server settings change and you need to apply them and restart the web server? What if all of this happens at once, I opened the project a month ago, remember nothing of what I changed there, and really ought to "do it properly", but don't want to tirelessly type out all the commands by hand? What if the project is huge and I don't want to waste time on unnecessary commands during build and startup? There can be endless such "what ifs".
The solution suggested itself: we need automation, a project builder. On Linux, of course. Googling, I found plenty of project builders... each for one language or one technology. Nothing truly universal, where you just register commands and it runs them on demand. There is cmake, but I didn't take it, because I came up with a better solution)

At this point the first version of my own reinvented wheel took shape. At first I simply wrote all the commands into a file, but with the slightest change, restarting everything took a long time, which was annoying. At first I put up with it. Then I wanted the script to have settings, so I put variables in the first lines and wrote logic to change them through the script's launch arguments. Then I wanted to skip some commands when they were not required, and added verification functions. Then came the idea of separating the commands and combining some of them with each other.

I called the combined commands a "target". The name of the target is passed to the script, and the target is executed. It turned out that some targets cannot run without other targets running first, and that is how a hierarchy appeared. Then the command verification functions turned into target check functions. Then I wanted to simplify package installation, and the "package" entity was created.

I could describe the development process at length, but that would probably be boring.

Result


The final working version is a bash script of about 400 lines, which I called xGod. I named it that because the file has become as indispensable to me in my work as air.
How xGod works:

- It runs from the console: bash ./xgod build.xg run
- build.xg is the build file, which contains all the targets and helper functions.
- run is the target to be executed.

What build.xg consists of:

1. Ordinary lines of bash code, which are executed sequentially as the file is read.
2. Targets

For example:

    target syncdb: virtualenv createmysqluser
        source "$projectpath/venv/bin/activate"
        python3 "$projectpath/manage.py" makemigrations
        python3 "$projectpath/manage.py" migrate
        deactivate

syncdb is the name of the target; virtualenv and createmysqluser are targets that must be completed before syncdb runs, its so-called dependencies; everything else is ordinary bash code with which the target does its work.
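The header-plus-dependencies idea can be sketched in plain bash. This is a hypothetical illustration of how a tool like xGod might resolve dependencies before running a target's body, not the script's actual code; all names and bodies below are made up:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of dependency resolution: every dependency runs
# before the target's own body, and each target runs at most once.
declare -A TARGET_DEPS TARGET_BODY DONE

TARGET_DEPS[syncdb]="virtualenv createmysqluser"
TARGET_BODY[virtualenv]='echo "setting up venv"'
TARGET_BODY[createmysqluser]='echo "creating db user"'
TARGET_BODY[syncdb]='echo "running migrations"'

run_target() {
    local name=$1 dep
    [[ -n "${DONE[$name]:-}" ]] && return 0   # already executed once
    for dep in ${TARGET_DEPS[$name]:-}; do    # run dependencies first
        run_target "$dep"
    done
    eval "${TARGET_BODY[$name]}"
    DONE["$name"]=1
}

run_target syncdb
# prints:
#   setting up venv
#   creating db user
#   running migrations
```

Running syncdb first triggers both dependencies, in order, before the migrations themselves.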

3. Packages:

For example:

    package gunicorn: python
        all:
            name: python3-gunicorn

gunicorn is the name of the package (to the script it is just another target); python is a dependency; all is the name of the distribution to which the nested settings apply. all means the settings apply to every distribution without exception; only debian and ubuntu are currently supported, because I have not worked with others. name is the package name used for installation.
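Since distribution names can replace all, a package can presumably carry a different install name per distribution. A hypothetical sketch following the syntax above (the gunicorn3/gunicorn package names are my assumption, not from the article):

```text
package gunicorn: python
    debian:
        name: gunicorn3
    ubuntu:
        name: gunicorn
```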

4. Check functions:

For example:

    check syncdb()
        # any code
        return 1 # or return 0
    endcheck

The check function lets you decide whether the syncdb target needs to run or not. It is stored and executed as an ordinary function and returns 1 (if the target must run) or 0 (if the target can be skipped).
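A concrete check might test for some on-disk state. This is a minimal runnable sketch under my own assumptions (the createdb target name and the database path are invented for illustration), following the article's convention that 1 means "run" and 0 means "skip":

```shell
#!/usr/bin/env bash
# Hypothetical check function for a "createdb" target: the target only
# needs to run when the database file does not exist yet.
DB_FILE="/tmp/xgod_demo.sqlite3"

check_createdb() {
    if [ -f "$DB_FILE" ]; then
        return 0   # database exists -> target can be skipped
    fi
    return 1       # no database -> target must run
}

if check_createdb; then
    echo "skipping createdb"
else
    echo "running createdb"
fi
```

Note this inverts the usual shell convention, where a zero exit status means success; here 0 means "nothing to do".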

An extension support system was also written; the package targets themselves are implemented as an extension. The syntax of an extension differs little from the syntax of the build files. It may contain:

1. Ordinary bash commands
2. A mandatory action function

For example:

    action
        # any code with $1
    endaction

This function receives the name of a target as input and executes it according to its own rules. It can get the full body of the target from the ${TARGETS[$1]} variable.
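A minimal runnable sketch of such an action function, assuming only that TARGETS is an associative array mapping a target's name to its body (the TARGETS name comes from the article; the sample target and its body are invented):

```shell
#!/usr/bin/env bash
# Sketch of an extension's action function: look up the target's body
# in the TARGETS associative array and execute it.
declare -A TARGETS
TARGETS[hello]='echo "hello from target body"'

action() {
    # $1 is the target name; an extension may run the body its own way
    eval "${TARGETS[$1]}"
}

action hello
# prints: hello from target body
```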

3. A target check function

For example:

    check
        # any code with $1
        return 1 # or return 0
    endcheck

It also receives the target name as input and checks whether the target needs to be executed: it must return 1 if so, and 0 if not.

Other applications


This script can be used for more than building and running projects from a machine's clean state. For example, I have my own set of packages that I want on every freshly installed system. Each new distribution release changes the default package set, so after installation I don't know which of those packages are already in the system. I could find out, of course, but I'm too lazy. It is much easier to list all the needed packages in a script and start their installation with one command: the ones already in the system are skipped, and the missing ones are installed. Simple.
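The "skip what is already installed" logic can be sketched for a Debian-based system with dpkg-query. This is my own illustration of the idea, not xGod's actual code, and the package list is just an example:

```shell
#!/usr/bin/env bash
# Report which packages from a wish list are already installed and
# which would need installing, using dpkg's status database.
want=(git curl htop)

report() {
    local pkg
    for pkg in "${want[@]}"; do
        if dpkg-query -W -f='${Status}' "$pkg" 2>/dev/null \
                | grep -q "install ok installed"; then
            echo "skip $pkg (already installed)"
        else
            echo "would install $pkg"
        fi
    done
}

report
```

A real installer would replace the "would install" branch with a call to the distribution's package manager.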

This use case dictated the script's main requirement: minimal dependencies for launching. That is why it is written in bash rather than Python or C++, so it can be run in any Linux environment without extra steps. The only downside is that bash must be at least version 4, since earlier versions do not support associative arrays.
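A guard like the following (my sketch, not taken from the script) makes that version requirement explicit at startup:

```shell
#!/usr/bin/env bash
# Associative arrays (declare -A) only exist in bash 4.0 and later, so
# fail fast with a clear message on older interpreters.
if ((BASH_VERSINFO[0] < 4)); then
    echo "xGod requires bash >= 4 (associative arrays)" >&2
    exit 1
fi
echo "bash ${BASH_VERSINFO[0]} supports associative arrays"
```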

I will leave the link to the code here.

Source: https://habr.com/ru/post/338698/

