
Backing up web projects on Yandex.Disk without OOP and models

The day before yesterday, habrauser vasiatka, in post No. 206752, shared with the Habr community, and therefore with the rest of the world, a thoughtful and well-developed class for working with Yandex.Disk. Some time ago I also started using this service to store backups. I want to share a much shorter PHP script that archives the database and the site files and uploads them via WebDAV. Maybe someone will like it too.

If you would rather look at the full listing right away, it is at the bottom of the post.

Description


The script I am offering for your attention is quite concise. If you exclude the parameter setup and the report output, about thirty lines remain. And although the parameters and their names speak for themselves, I will still describe them with comments.

What clouds does this script work with?


Since the files are uploaded over the WebDAV protocol, the script should be able to work with any storage of this kind.
I have tested only Yandex and Mail.Ru; the upload succeeds, but the target folder must be created on the cloud in advance. As of 2013-12-22, Mail.Ru's support for this protocol is not advertised anywhere and is only claimed to operate in test mode. Use with caution.
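Before trusting the script with real backups, you can check that the endpoint answers at all by asking it to list the target folder. A minimal sketch in the same curl-through-PHP style as the script, assuming $WebDAV is already filled in (PROPFIND is WebDAV's listing method; the Depth header limits the listing to one level):

 // quick sanity check: list the target folder over WebDAV (not part of the script itself)
 echo shell_exec("curl --user {$WebDAV['login']}:{$WebDAV['password']} -X PROPFIND -H 'Depth: 1' {$WebDAV['url']}");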

Settings


Login, password, and the path on the cloud

The login is specified in full, i.e. with the @ sign and the domain. The password, like any self-respecting password, is specified in full.
$WebDAV = [
    'login' => '',
    'password' => '',
    'url' => 'https://webdav.yandex.ru/backups/sites/', // Yandex.Disk
    //'url' => 'https://webdav.cloud.mail.ru/backups/sites/', // Mail.Ru
];
The cloud URL with the path to the target folder must be specified in full. The folder where you plan to put your precious archives has to be created in advance by yourself: I checked, and it definitely will not appear automatically. If you skip this, the upload will still run. In the case of Yandex, judging by the size of the result, the uploaded files are simply glued together into one large one; in the case of Mail.Ru nothing appears at all. The trailing slash is mandatory.
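You do not have to create the folder through the web interface: WebDAV has a MKCOL ("make collection") method for exactly this. A hedged sketch in the script's own curl style; note that MKCOL creates one level at a time, so a nested path needs one call per level, and the URLs below assume the Yandex example:

 // one-time creation of the target folders, level by level
 shell_exec("curl --user {$WebDAV['login']}:{$WebDAV['password']} -X MKCOL https://webdav.yandex.ru/backups/");
 shell_exec("curl --user {$WebDAV['login']}:{$WebDAV['password']} -X MKCOL https://webdav.yandex.ru/backups/sites/");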

Local path to create archives

Although archiving and uploading to the service could be done in one line, in my script the file has to be created somewhere before it is uploaded. The path for creating the archives is set in this variable:
 $backupPath 

You can place the storage of local copies inside a site, and add its path to the list of paths excluded from archiving.
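The listing below expects bases and files subfolders under this path. If you would rather have them created automatically, here is a small sketch of my own (not part of the original script):

 // create the local archive folders if they are missing; 0700 keeps the dumps private
 foreach (["$backupPath/bases", "$backupPath/files"] as $dir) {
     if (!is_dir($dir)) {
         mkdir($dir, 0700, true); // true = create parent folders as needed
     }
 }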

List of databases

Write the databases you want to archive into the $databases array. For this kind of use it is probably worth creating a database user without modification or write privileges (a sketch follows below).
 $databases = [['login' => '', 'password' => '', 'dbname' => '']]; 
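For example, a read-only account could be set up like this; a sketch with hypothetical names ('backup', 'secret', 'site1_db'), run once under an administrative account. mysqldump needs SELECT plus LOCK TABLES to take a dump:

 // one-time setup of a read-only MySQL user for backups
 $admin = new mysqli('localhost', 'root', 'root_password');
 $admin->query("CREATE USER 'backup'@'localhost' IDENTIFIED BY 'secret'");
 $admin->query("GRANT SELECT, LOCK TABLES ON site1_db.* TO 'backup'@'localhost'");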


Site List

A typical member of the $sites array, the list of sites being archived, looks like this:
[
    'name' => 'site1.ru',
    'path' => '/var/www/site1.ru',
    'exclude' => []
]

The first parameter is the site name: just a string that will be used in the name of your site's archive.
The second is the path to the site files.
The third is a list of directories that should be excluded from archiving. I did not provide for excluding individual files, but you can easily add that yourself (see the sketch after the example below). If you create the archives inside one of your sites, you can exclude that storage.
Example:
'exclude' => [
    $backupPath,             // the local archive storage, if it lives inside the site
    '/var/www/site2.ru/temp' // some temporary folder
]
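Since the script appends \* to every entry, each exclusion acts as a directory pattern for zip's -x option. To also skip individual files, one possible extension (a sketch of mine with a hypothetical 'exclude_files' key, not in the listing below) is a second list passed to -x without the trailing wildcard, added right after $exclude is built:

 // hypothetical per-site 'exclude_files' list: exact file names for zip -x, no trailing \*
 if (!empty($site['exclude_files'])) {
     $exclude .= ' -x ' . implode(' -x ', $site['exclude_files']);
 }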


This is where the configuration of the script ends, and it is ready to work for the benefit of the revolution... society... the webadmin.

If the database is small and the files do not weigh too much, you can call the script from cron even every hour. Files with the same name will be overwritten both in the local storage and on Yandex.Disk. The next day, a file with the new date will be created, named something like this:
site1.ru.2013-12-20.zip
If you want to give the archives a unique name every time, you should replace the line
 $date = date('Y-m-d'); 
with
 $date = date('Y-m-d_H-i-s'); 
and then the date will include the time, down to the second.
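For reference, a typical crontab entry for such an hourly run might look like this (a sketch; the script path is a placeholder):

 # run the backup script every hour, discarding the report output
 0 * * * * /usr/bin/php /var/www/tools/backup.php >/dev/null 2>&1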

Listen and don't say you haven't heard


I want to warn you that this script does not claim to be a reliable backup method for frequently changing sites, sites with a heavy database, or other grown-up, serious web projects. But for small business-card sites it is just the thing.
Important note: your login and password will be visible in the list of running processes during the upload.
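One way to keep them out of the process list is to let curl read the credentials from a netrc file instead of the command line. A sketch (the file path is a placeholder, and the file itself should be readable only by you, e.g. chmod 600):

 // ~/.netrc contains one line: machine webdav.yandex.ru login you@yandex.ru password secret
 // the upload call in the script then becomes:
 echo shell_exec("curl --netrc-file /home/you/.netrc -T \"$file\" {$WebDAV['url']}") . "<br>\n";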

That's all, really


Thank you for your attention and for reading this post. I hope my work will make life easier for someone, bring peace of mind, and maybe even save the day at a difficult moment. Good luck!

UPD: Thanks to lolipop for mentioning Mail.Ru. I have changed the script so that it works with their terabyte storage. Now there is something to fill it with! And although while searching I first ran into information that they have no WebDAV support, I later found a mention of the required address here. So far it is only in test mode. It is a pity this came up only now: my topic has already drifted off the list of new posts, and those who have already read it will probably not find out about it. And in light of the terabyte-of-space promotion this is quite relevant.

UPD: Thanks to vk2 for the correct remark that your login and password will be visible on the server in the process list.

UPD: Thanks to kosenka, who pointed out that the trailing slash in the WebDAV path is mandatory. Indeed, it is a folder. I had missed it, but have now fixed it both in the example and in the script.
He also suggested that if curl complains about a certificate, the -k flag should be specified; it allows connecting to sites with a missing or invalid certificate.
He also rightly noted that zip is not available on every host. I will try to add a line for kosher gzip in the near future.

Full listing
$WebDAV = [
    'login' => '',
    'password' => '',
    // url: the full path to the target folder on the cloud; the folder must be created in advance
    //'url' => 'https://webdav.yandex.ru/backups/sites/', // Yandex.Disk
    'url' => 'https://webdav.cloud.mail.ru/backups/sites/', // Mail.Ru
];

$backupPath = 'path to backups'; // local folder where the archives are created before being uploaded via WebDAV

$databases = [['login' => '', 'password' => '', 'dbname' => '']];

$sites = [
    // a minimal example:
    [
        'name' => 'site1.ru',
        'path' => '/var/www/site1.ru',
        'exclude' => []
    ],
    // an example with exclusions
    [
        'name' => 'site2.ru',
        'path' => '.', // 'path' => '.' means the current folder, the one the script itself is in
        'exclude' => [ // folders that will not be archived; exclude the local storage if it sits inside the site
            $backupPath,             // the local archive storage, if it lives inside the site
            '/var/www/site2.ru/temp' // some temporary folder
        ]
    ],
];

// End of the settings.
/////////////////////////////////////////////////////////////////////////////////////////

$date = date('Y-m-d');
$errors = [];
$success = [];
$files_to_send = [];

// dump every database, gzip it, and remember the file for uploading
foreach ($databases as $db) {
    $filename = "$backupPath/bases/{$db['dbname']}.$date.sql.gz";
    $output = `mysqldump --user={$db['login']} --password={$db['password']} {$db['dbname']} | gzip -f > $filename`;
    if (!file_exists($filename)) {
        $errors[] = 'Dump ' . $db['dbname'] . ' failed: ' . $output;
    } else {
        $success[] = 'DB ' . $db['dbname'] . ' dumped';
        $files_to_send[] = $filename;
    }
}

// archive every site with zip, honouring the exclude list
foreach ($sites as $site) {
    $filename = "$backupPath/files/{$site['name']}.$date.zip";
    $exclude = '';
    if ($site['exclude']) {
        $exclude = '-x ' . implode('\* -x ', $site['exclude']) . '\*';
    }
    $cmd = "zip -r \"$filename\" {$site['path']} $exclude";
    echo $cmd . "<br>\n";
    $output = `$cmd`;
    if (!file_exists($filename)) {
        $errors[] = 'Site backup ' . $site['name'] . ' failed: ' . $output;
    } else {
        $success[] = 'Site ' . $site['name'] . ' saved';
        $files_to_send[] = $filename;
    }
}

// report
foreach ($errors as $e) {
    echo 'Error: ' . $e . "<br>\n";
}
echo "<br>\n";
foreach ($success as $s) {
    echo 'Success: ' . $s . "<br>\n";
}
echo "<br>\n";
echo "Files to upload:<br>\n";
foreach ($files_to_send as $f) {
    echo $f . "<br>\n";
}
echo "<br>\n";

// upload everything via WebDAV
if (!empty($files_to_send)) {
    foreach ($files_to_send as $file) {
        echo shell_exec("curl --user {$WebDAV['login']}:{$WebDAV['password']} -T \"$file\" {$WebDAV['url']}") . "<br>\n"; // if curl complains about the certificate, add the -k flag
    }
}


Source: https://habr.com/ru/post/206898/

