
Blocking duplicate Symfony commands




Today I want to share a solution to the "inconveniences" that arise when a process is launched periodically while the previous run has not yet finished. In other words, locking out duplicate runs of a symfony/console command. That alone would be trivial, were it not for the need to lock across a whole group of servers rather than on a single one.



Given: the same process running on N servers.

Task: ensure that only one instance runs at a time.



The most popular solutions that can be found in the wild:

  1. locking through a database;
  2. third-party applications;
  3. native use of lock files.


The main disadvantages of each of them are:



Database

  - you need a working database connection just to take a lock, and the locking traffic adds extra load and failure modes to the database itself.



Third-party applications (for example, run-one for Ubuntu)

  - they lock only within a single machine, so they do not help when the command runs on a group of servers, and they add an external dependency that has to be installed and maintained everywhere.



Native lock files

  - with flock the lock is local to one server, and with plain lock files every command has to create the file and, more importantly, remove it manually; a crashed script leaves a stale file behind.

The most common choice, of course, is the third option, but it becomes quite inconvenient with a large number of servers and processes. So I decided to share the idea of writing a singleton command on top of symfony/console. The same idea can be applied in any other framework.



So, the first thing we had to give up was flock, which is used, for example, in Symfony's LockHandler: it cannot provide locking across multiple servers.
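For reference, here is a minimal sketch of the flock-based approach that LockHandler takes (the file path is illustrative). The advisory lock lives in the kernel of the machine that opened the file, which is exactly why it cannot coordinate several servers:

```php
<?php
// flock-based locking: works only within a single machine,
// because the advisory lock lives in that machine's kernel.
$handle = fopen('/tmp/create-user.lock', 'c'); // 'c' = create if missing, don't truncate

if (!flock($handle, LOCK_EX | LOCK_NB)) {
    // Another process on THIS server already holds the lock.
    echo "Command is already running, exiting.\n";
    exit(1);
}

// ... do the actual work ...

flock($handle, LOCK_UN);
fclose($handle);
```

A second process on the same server fails to acquire the lock and exits; a process on a different server knows nothing about it, even if the file itself sits on a shared mount.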



Instead, we will create a lock file in a directory shared between the servers, using a small service. It is practically an analogue of LockHandler, but with flock cut out.
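A minimal sketch of such a service, assuming the class and method names (the original service on GitHub may differ). The trick is fopen's exclusive-create `'x'` mode: the create is atomic, so only one process, on whichever server mounts the shared directory, can create the lock file. Note that exclusive-create semantics on network filesystems depend on the mount:

```php
<?php
// Sketch of a LockHandler-like service without flock (names are illustrative).
class SharedDirLock
{
    private $lockFile;

    public function __construct(string $sharedDir, string $name)
    {
        $this->lockFile = rtrim($sharedDir, '/') . '/' . $name . '.lock';
    }

    public function lock(): bool
    {
        // 'x' fails if the file already exists; @ suppresses the warning
        $handle = @fopen($this->lockFile, 'x');
        if ($handle === false) {
            return false; // somebody else holds the lock
        }
        fwrite($handle, (string) getmypid()); // record the owner, handy for debugging
        fclose($handle);
        return true;
    }

    public function release(): void
    {
        @unlink($this->lockFile);
    }
}
```

A second lock() call, from any server sharing the directory, returns false until release() is called.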



The next thing to get rid of is the need for every command to check the lock manually and, most importantly, to remove it, because a script does not always terminate where we expect.



To do this, I propose something similar to the Mediator pattern: override the standard execute() method, mark it final so that it handles the lock at command start, and require commands to implement the new lockExecute() method instead.
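The idea can be sketched as follows. To keep the sketch self-contained it drops the Symfony types (in the real base class, execute() is inherited from Command and takes InputInterface / OutputInterface); all names and the lock directory are illustrative:

```php
<?php
// Framework-free sketch of the mediator idea (names are illustrative).
abstract class SingletonCommand
{
    private $name;
    private $lockDir;

    public function __construct(string $name, string $lockDir)
    {
        $this->name = $name;
        $this->lockDir = $lockDir;
    }

    /** Subclasses put their work here instead of execute(). */
    abstract public function lockExecute(): int;

    /** Final: acquires the lock, runs the command, always releases. */
    final public function execute(): int
    {
        $lockFile = $this->lockDir . '/' . $this->name . '.lock';

        $handle = @fopen($lockFile, 'x'); // atomic exclusive create
        if ($handle === false) {
            return 1; // previous instance still running somewhere
        }
        fclose($handle);

        try {
            return $this->lockExecute();
        } finally {
            unlink($lockFile); // removed even if lockExecute() throws
        }
    }
}
```

Because execute() is final, a subclass cannot accidentally bypass the lock, and the finally block guarantees the lock file is removed however the command ends.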



What is it for: the lock is acquired before the command body runs and is guaranteed to be released when the command finishes, whether it returns normally or throws, so individual commands no longer have to manage lock files themselves.





In summary, a standard symfony command:



```php
class CreateUserCommand extends Command
{
    protected function configure()
    {
        // ...
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        // ...
    }
}
```


would look like this:



```php
class CreateUserCommand extends SingletonCommand implements SingletonCommandInterface
{
    protected function configure()
    {
        // ...
    }

    public function lockExecute(InputInterface $input, OutputInterface $output)
    {
        // ...
    }
}
```


It takes hardly any extra code, and the command is guaranteed to run in only one instance, no matter how many servers try to start it. The only requirement is a shared directory for the lock files.



A ready-made solution with more details can be found on GitHub: singleton-command



UPD: as was rightly pointed out, a "hard" crash of a script can leave its lock file behind. It is therefore advisable to set up a daemon that watches for stale lock files.
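Such a watcher can be as simple as a periodic (e.g. cron-driven) sweep that removes lock files older than the longest expected command runtime. The directory and threshold here are illustrative assumptions:

```php
<?php
// Sketch of a stale-lock watcher: any lock file older than the
// threshold is assumed to belong to a crashed script and is removed.
function removeStaleLocks(string $lockDir, int $maxAgeSec): int
{
    $removed = 0;
    foreach (glob($lockDir . '/*.lock') as $file) {
        if (time() - filemtime($file) > $maxAgeSec) {
            unlink($file); // stale: its owner most likely crashed
            $removed++;
        }
    }
    return $removed;
}
```

The threshold is a trade-off: too short and a long-running command loses its lock; too long and a crashed command blocks its successors for that whole window.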



Thanks for your attention!

Source: https://habr.com/ru/post/317258/


