When we were children, we didn't understand much of what now seems absolutely obvious. "Why can't I eat more sweets? Why aren't sandwiches a proper meal if I'm eating them? Why can't I sleep with the hamster? Why brush my teeth every day, and twice at that?!" Indeed, why? But as adults, as mature people, we no longer ask ourselves such questions.
Something similar happens in the IT world (and perhaps in any professional field). In the early stages there is a lot we cannot understand. "Who even needs all these Symfonys when there's WordPress? Why waste time on automated testing if everything already works? Git? Linux? SOLID, DRY, DDD, TDD? What are all these scary words?" It's hard for beginners to grasp why so much is needed. Awareness comes with time, and soon such things become not just clear but obvious.
I want to tell the story of how I grew into Docker and what led me there.
Disclaimer
This article is just a narrative, containing both objective and subjective thoughts. If I am wrong about something or seriously mistaken, please tell me. I appreciate sensible, level-headed criticism and am ready to accept constructive comments.
Docker is quite a simple thing. But have you ever tried to explain to a mere mortal what it is? I have. It didn't work out. For a long time I myself didn't understand why it was needed. I had heard about it and tried to dig in, but gave up every time. Apparently it was still too early; I hadn't matured yet. But at some point came the "Eureka!" It was a long road, and I would like to share it with others. Perhaps this article will help you shorten your own path and start using this wonderful tool sooner.
When I started working as a web developer, I landed in one very small office. My experience was zero; we built simple websites on our own, as we called it, "CMS". The word OpenSource was unfamiliar to me, every function was written by hand, and there was no architecture to speak of. The security holes were like black holes. While sites like a blog, the simplest online store, or a news feed were still somehow feasible, anything even slightly more complex caused wild pain and suffering. We worked over FTP, which caused serious problems whenever two people edited the same file. If someone broke something or accidentally wiped a file, it was gone forever.
Then a customer came to us, who also happened to be a good friend of mine, with a project that stood out against all our others. We got down to work, but soon the office closed and I continued working on the project alone. At that point I finally had the chance to do everything properly...
At the second stage, several decisions were made; I don't remember their exact order, and most likely they happened at the same time.
First, I began storing all the code in a private Git repository on Bitbucket. This is probably the first thing any developer should learn. It protected the code from outside mishaps: the VPS could break or be hacked, or I could simply miss a payment. I think the point is clear. In addition, I closed FTP access and deployed via a Git push webhook from the release or test branch.
Secondly, instead of one site, two almost identical ones were running. The first was the release site, serving real customers. The second was a password-protected test site carrying all the latest unstable features. The latter is needed so that nothing untested, and possibly broken, gets rolled out to production. Although both sites ran on the same host, they were independent.
Thirdly, and most importantly, development moved to my local machine. I installed the LAMP stack on the laptop, set everything up, and ran the project locally on a .dev subdomain (UPD: this TLD turned out to be taken and should not be used for local development; thanks to alexkbs for the comment). This sped up development considerably, let me work offline, removed the fear of breaking something, and made life a little easier.
But then I messed something up and had to reinstall Ubuntu, which meant installing and configuring the whole stack all over again... That could be automated with a bash script, but the script would be hard-wired to one particular system. I might have been working alone, but management planned to hire a second developer, so I wanted to make life easier both for him and for myself. After a few more mishaps like that, I finally found a solution.
Vagrant is, simply put, a tool that lets you configure a VirtualBox machine with a script. Write the right config, and a single vagrant up command loads a ready Debian image, installs the necessary software, configures it, and hands you a working development environment.
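As a sketch of that idea, a minimal Vagrantfile could look like this (the box name, port, and package list are illustrative assumptions, not the exact config I used):

```ruby
# Vagrantfile -- hypothetical sketch of the setup described above
Vagrant.configure("2") do |config|
  # Debian base box (pick whatever box matches your hosting environment)
  config.vm.box = "debian/stretch64"

  # Forward the web server port so the site is reachable from the host browser
  config.vm.network "forwarded_port", guest: 80, host: 8080

  # Install the LAMP stack with a simple shell provisioner
  config.vm.provision "shell", inline: <<-SHELL
    apt-get update
    apt-get install -y apache2 mysql-server php libapache2-mod-php
  SHELL
end
```

With a file like this in the project root, `vagrant up` builds the whole environment, and a teammate gets the same machine from the same file.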
Isn't that a miracle? It actually is! This noticeably simplified setting up the development environment, cleared the host system of unnecessary packages, and, thanks to virtualization, let me configure the machine to match the hosting environment as closely as possible.
Virtualization is great for development, but in production it is no good: it's a heavy overhead for the system. Besides, this approach still left several problems unsolved, and nobody ever has a few spare gigabytes of RAM lying around.
Now take all the problems from all the stages above and imagine a tool that solves them for you, simply and elegantly.
What is Docker? The internet is full of comparisons with the containers used to ship goods, and they are apt.
But picture a big hall and a lot of people working on something, not necessarily the same thing: accountants, lawyers, carpenters, painters, analysts, programmers. Yes, they sometimes interact. But since they have no personal workspaces, the work looks more like chaos: documents get mixed up, someone grabs someone else's pen, sits in someone else's chair, pokes at someone else's computer, and so on. Put up partitions and give each person or group a dedicated workplace, and life gets easier. Everyone does their own job with their own resources and bothers no one. Interaction happens through dedicated channels: mail, archives, doors.
Docker works the same way. The big hall is your system (in reality it's a bit more complicated, but for a general understanding of the concept this simplification is fine). Docker can create an isolated space for a process or a group of processes to do their work. Interaction happens through open ports, shared files, and internal and external networks. No one bothers anyone. You can therefore run even twenty versions of PHP, or anything else, on a single host. And thanks to images you can, figuratively speaking, order a ready-made office with all the perks.
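As a sketch of that isolation (service names and host ports here are made up for illustration), a docker-compose fragment can run two PHP versions side by side on one host without them ever touching each other:

```yaml
# Hypothetical docker-compose fragment: two isolated PHP versions on one host
php71:
  image: php:7.1-apache
  ports:
    - "8071:80"   # the old app answers on host port 8071

php72:
  image: php:7.2-apache
  ports:
    - "8072:80"   # the new app answers on host port 8072
```

Each service lives in its own container with its own PHP runtime; the only shared surface is the host ports you choose to expose.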
Adding new components to the project became a matter of a few lines thanks to docker-compose. Here are a couple of examples:
```yaml
# PHPMyAdmin
pma:
  image: phpmyadmin/phpmyadmin:4.7
  restart: always
  environment:
    MYSQL_ROOT_PASSWORD: ${DB_PASSWORD}
  depends_on:
    - db
  ports:
    - "181:80"

# MySQL
db:
  image: mysql:5.7.20
  restart: always
  environment:
    MYSQL_DATABASE: ${DB_NAME}
    MYSQL_ROOT_PASSWORD: ${DB_PASSWORD}
  volumes:
    - db-data:/var/lib/mysql
```
Need to update the MySQL version? How much time and effort would that take on a regular VPS? Here it's a matter of changing three or four characters.
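Concretely, the upgrade is a one-line edit to the image tag in the compose file above (the new tag here is illustrative):

```yaml
db:
  image: mysql:5.7.21   # was 5.7.20; only the tag changes
```

After the edit, `docker-compose up -d db` pulls the new image and re-creates the container, with the data surviving in the named volume.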
Because Docker runs inside the host system rather than creating a new one, it doesn't load the development machine heavily and is perfectly suitable for production. And that is a big plus: code running in a container on a developer's Ubuntu machine will behave exactly the same as on a Debian server.
Another benefit of this solution is that automated testing became much easier. Tests run inside isolated containers that are created fresh before each run, so they are stable and reproducible.
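One common way to structure this (the service name, build context, and test command are assumptions for a PHP project with PHPUnit, not my exact setup) is a separate compose file with a one-off test-runner service:

```yaml
# Hypothetical docker-compose.test.yml: a disposable test runner
tests:
  build: .                      # image built from the project's own Dockerfile
  command: vendor/bin/phpunit   # run the suite, then the container exits
  depends_on:
    - db                        # tests get a fresh database container each run
```

Running it with `docker-compose -f docker-compose.test.yml run --rm tests` creates the containers, executes the suite, and removes the runner afterwards, so no state leaks between runs.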
Docker is not easy to grasp, despite its simplicity; you need to grow into it. I had no reliable guide, so I had to figure everything out myself. Perhaps someone like me will grow a little faster thanks to this article.
One more thought. Adopting Docker can sometimes be overkill. If you have a personal blog, a one-page site, or some other simple project, setting it all up will eat your time and resources, and the payoff will be zero or even negative. But for a large, long-term project, Docker is likely to be the right decision. It's like automated tests: small projects don't need them, while large ones can't do without them.
And yet, I don't want to teach anyone anything bad. If I have given any false information here, please point it out and help me fix it.
Source: https://habr.com/ru/post/343572/