Hello, Habr readers! Today I want to share my approach to backing up MySQL data and show how it can be used for database version control with Git. If you are interested in tracking the state of the database at every stage of development, or simply in making proper backups of your project's database and being able to restore them at any time, then read on!
This is a set of scripts written in BASH, which lets them run on almost any machine where that shell is available, designed to simplify creating and deploying database backups. The original idea was to create checkpoints for the database while a project is being written by a team of developers. I know there are more serious tools for this purpose, and this solution does not pretend to take their place.
For example, you prefer to develop a site directly on the customer's hosting and want to track development progress and database changes. Or you have little money (or are simply too stingy) to spend on good commercial database versioning products. You can also use the project as a rule-based backup tool driven by cron. And of course it will come in handy if you are a beginner developer just mastering the basics, your site periodically returns HTTP 500 errors, and you don't know why. Or you are developing a product with a team and want the database to sync with production automatically, so the customer can assess progress when you push to master.
Consider a typical example of site development on the hosting side (the most common case):
To keep the database under version control with Git, you obviously need to take dumps of it at certain stages, store them somewhere, and account for them when switching branches. For this I used Git hooks, which are simply script files with the corresponding names (they must be installed on the local machine where Git is used). Depending on the configuration file settings, the workflow may look like this:
- We create a branch (a backup is taken automatically) and switch to it; we work, add files, create a commit (a backup is taken automatically again) ...
- We switch to the master branch; the database is switched back to its previous state ...
- We return to development, merge the branches, and carry on.

In other words, backups are created automatically on commit, or forcibly before a checkout; the behavior is configured in the config file. You can also manually export or import the server database from your local computer by running the corresponding script.
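The hook mechanism itself can be illustrated with a tiny self-contained demo. Note this is a simulation: the stand-in post-commit hook below just writes a marker file where the real hook would call `db_export.sh`.

```shell
#!/usr/bin/env bash
# Demo: a post-commit hook fires automatically on every commit.
# The real project installs hooks that call its db_export.sh script;
# here the hook only simulates a dump by writing a marker file.
set -euo pipefail

repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

mkdir db_backup
# Stand-in hook: the real one would run db_export.sh instead.
cat > .git/hooks/post-commit <<'EOF'
#!/bin/sh
echo "-- dump taken at commit" > db_backup/db.sql
EOF
chmod +x .git/hooks/post-commit

echo hello > file.txt
git add file.txt
git commit -qm "first commit"   # the hook fires here, creating the "dump"

test -f db_backup/db.sql && echo "backup created"
```

Running the script prints "backup created", showing that the dump appeared as a side effect of the commit, with no extra command from the developer.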
Each script can print help in the classical way via the -h or --help argument.
I do not recommend backing up the entire database: Git does not like large files, and in most cases it is not necessary. You can easily configure what gets dumped via config.ini.
Since the settings are used both on the server side (where MySQL runs) and on the client side (the developer's computer), the same file is responsible for both. And of course it can be the same computer, if you develop locally.
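To illustrate how a single config.ini can serve both sides, here is a minimal sketch (hypothetical, not the project's actual parser) of reading one key from a named section of an ini file in pure BASH:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: read KEY from [SECTION] of an ini-style config file.
# read_ini FILE SECTION KEY  ->  prints the value (quotes stripped)
set -euo pipefail

read_ini() {
  local file=$1 section=$2 key=$3
  # Print lines from "[section]" to the next "[", pick the key, drop quotes.
  sed -n "/^\[$section\]/,/^\[/p" "$file" \
    | grep -E "^$key=" | head -n1 | cut -d= -f2- | tr -d '"'
}

# Demo with a throwaway config file
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[common]
EXPORT_FILE="db.sql"
[only_users]
EXPORT_FILE="users.sql"
EOF

read_ini "$cfg" only_users EXPORT_FILE   # -> users.sql
```

Both the server-side and client-side scripts can call such a helper against the same file, each reading only the section relevant to it.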
To simplify the process of creating dumps, I used provider files, and so far have written only one, for CMS MODX Revolution. Based on it, you can write a similar provider for any CMS.
Installation: in the root of your project (the directory that contains .git), clone the repository with [git clone](https://github.com/Setest/.git-db-watcher), then run chmod +x install.sh; ./install.sh, or ./install.sh -nh if you do not want the Git hooks installed.
Now you need to make changes to config.ini. For example, like this:
```ini
[hooks]
H_CHECK_DB_HASH_BEFORE_CHECKOUT=1
; git checkout -b new_branch_name
H_CHECKOUT_FORCE=0
H_CHECKOUT_EVERCOM=1
H_CHECKOUT_CLEARCACHE=1

[common]
EXPORT_FILE="db.sql"
; ./export.sh

[develop]
; db_export.sh
CLI_DB_EXPORT="ssh host '/path/to/project/on/server/.git-db-watcher/db_export.sh'"
CLI_DB_IMPORT="ssh host '/path/to/project/on/server/.git-db-watcher/db_import.sh'"

[server]
PHP_PATH="/usr/local/bin/php"
CONFIG_INC_PATH="/path/to/project/on/server/core/config/config.inc.php"
PROVIDER=modx
DB_TABLES_INCLUDE=site_content
DB_TABLES_AUTOPREFIX=1

[server_full_site]
PHP_PATH="/usr/local/bin/php"
CONFIG_INC_PATH="/path/to/project/on/server/core/config/config.inc.php"
; providers
PROVIDER=modx
DB_CONFIG_HOST=
DB_CONFIG_TYPE=
DB_CONFIG_USER=
DB_CONFIG_PASSWORD=
DB_CONFIG_CONNECTION_CHARSET=
DB_CONFIG_DBASE=
DB_CONFIG_TABLE_PREFIX=
DB_CONFIG_DATABASE_DSN=
; DB_TABLES_INCLUDE=manager_log register_messages user_attributes
DB_TABLES_INCLUDE=site_content
; DB_TABLES_EXCLUDE=session register_messages mse2_words ec_messages
DB_TABLES_AUTOPREFIX=1
; INSERT
DB_TABLES_REMOVE_INSERT="manager_log session register_messages"
; DB_TABLES_REMOVE_INSERT="manager_log"
; DB_TABLES_DEFAULT=user_attributes users
DB_TABLES_DEFAULT=user_attributes
; DB_TABLES_DEFAULT_user_attributes=sessionid logincount lastlogin thislogin
DB_TABLES_DEFAULT_users=session_stale

[only_users]
DB_TABLES_INCLUDE=user user_attributes
EXPORT_FILE="users.sql"
DB_TABLES_DEFAULT=user_attributes user
DB_TABLES_DEFAULT_user_attributes=sessionid logincount lastlogin thislogin
DB_TABLES_DEFAULT_users=session_stale
```
If everything is configured correctly, you can run `./export.sh`, and you should end up with a dump of the database both on the local computer and on the server.
I need to save the result on the server to another location:

```shell
./db_export.sh --output 1>./xxx.sql
```
I want to export on the server using a specific section of the configuration file:

```shell
./db_export.sh --config=only_users --output 1>./users.sql
```
I want to import a database file, but without going through the Git hooks?

```shell
./import.sh
./import.sh EXPORT_FILE=site_name.sql
./import.sh DB_BACKUP_FILE=/.../../site_name.sql
./import.sh --config=site DB_BACKUP_FILE=./site_name.sql
```
How to import while on the server?

```shell
./db_import.sh < db_backup/db.sql
```
In different projects I use CMS xxx, and I am tired of entering the database credentials every time; how can I simplify the process? To do this, write your own provider file by analogy with the existing one.
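As a rough illustration (the file format and names here are invented, not the real provider API), a provider's job boils down to turning a CMS config file into the DB_CONFIG_* variables that the export/import scripts expect:

```shell
#!/usr/bin/env bash
# Hypothetical provider sketch. Assumption: an imaginary CMS stores its
# credentials in a simple KEY=VALUE file; a real provider (like the MODX one)
# would instead extract them from the CMS's own config (e.g. via PHP).
set -euo pipefail

cms_config=$(mktemp)
cat > "$cms_config" <<'EOF'
db_host=localhost
db_user=site
db_pass=secret
db_name=site_db
EOF

# Small helper: look up one key in the CMS config file.
get() { grep "^$1=" "$cms_config" | cut -d= -f2-; }

# Map CMS settings onto the DB_CONFIG_* variables the scripts read.
DB_CONFIG_HOST=$(get db_host)
DB_CONFIG_USER=$(get db_user)
DB_CONFIG_PASSWORD=$(get db_pass)
DB_CONFIG_DBASE=$(get db_name)

echo "$DB_CONFIG_HOST $DB_CONFIG_USER $DB_CONFIG_DBASE"   # -> localhost site site_db
```

Once the provider exports these variables, the rest of the tooling no longer cares which CMS the credentials came from.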
I created a cron job and a configuration using the PHP provider, but it is not executed, or the CMS site cache is not cleared; what could be wrong? Depending on the server settings and the task itself, cron jobs can run in a completely different environment, where the path to the PHP interpreter may differ; as a result, a completely different version of PHP may run, one incompatible with the version your CMS uses.
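Assuming that is the cause, one fix is to make the crontab entry explicit about every path instead of relying on cron's environment (all paths below are placeholders):

```shell
# Edit with: crontab -e
# Run the export every night at 03:00. PHP_PATH in config.ini should point
# at the same PHP binary the CMS itself runs under, not whatever `php`
# happens to resolve to in cron's minimal PATH.
0 3 * * * cd /path/to/project/on/server/.git-db-watcher && ./db_export.sh --output 1>/path/to/backups/db.sql 2>>/path/to/backups/cron.log
```

Redirecting stderr to a log file also makes it much easier to see why a silent cron run failed.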
I have never written BASH scripts before, so the code is probably messy; I am sure there are competent people here who, if interested, will contribute their fixes. I will develop the project according to the interest it generates and the bugs that are found.
And please don't rush to complain that nothing works; perhaps you just could not figure out the proper configuration and installation (especially if you are working on Windows, since BASH is, after all, a Linux environment).
Installation and usage instructions are in the README. I tried to write it in English right away, but given my amateur level, not everything may be clear; in the future I will also write it in Russian. If you want to improve the translation or the code, fork away! And if you have good advice, please share it.
PS: if you have read this far, then it caught your interest and you can't wait to try it :-)
Source: https://habr.com/ru/post/449868/