
Backing up to Google Cloud Platform (GCP) with scripts in five minutes

People have long been asking how to organize backups to the cloud without spending much effort or time, and ideally for free. The reason for writing this article was yet another conversation on the topic. After a long exchange with colleagues, explaining the theory and looking for practical ways to implement it, I decided to write it all down. It is dead simple, yet the question of HOW keeps coming up.



Backup to the cloud is no longer a new topic. Everyone chooses their own cloud provider, their own copying tools, and so on. There are many vendors; here we will look at the Google Cloud Platform. We will implement the simplest possible scripts, without buying software, disk storage, or anything else.



What we have



We will not go into the intricacies of the infrastructure; everyone has their own. The bottom line: there is a backup server onto which copies of the file storage and data from the machines are collected, packed into archives, and written to two disk arrays. Eventually, a decision was made to keep one more copy outside the perimeter.


Two types of copying



So, we needed to apply two different backup schemes:



  1. Replicating the existing set of copies from the server
  2. Storing copies long-term only in the cloud


Let's start with the cloud.



Cloud preparation



Sign up for the $300 trial in the Google Cloud Platform (the $300 credit is valid for a year, which is enough to last a long time).



Once the trial is activated, the console becomes available to us. In the menu, go to the Storage section.






We will be asked to create a project; create one and name it whatever you like. After creating the project, create a bucket in the Storage section; this will be our repository for copies. Name it as you see fit; for this article I created bakwin for replication and, separately, backupwin for day-by-day copying and storage. The storage class will be Coldline, the cheapest one, intended for backups. It costs $0.007 per gigabyte per month.
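If the command line is more convenient than the console, the same buckets can be created with gsutil once the Cloud SDK is installed (see the next section). A minimal sketch; the region us-east1 is only an example, pick whatever suits you:

 # create the two Coldline buckets used in this article
 gsutil mb -c coldline -l us-east1 gs://bakwin
 gsutil mb -c coldline -l us-east1 gs://backupwin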




Machine preparation



Everything is simple on the server side. Go to the Google Cloud documentation, open the Cloud SDK section, and follow the instructions. In my case it was a machine with Windows Server, so we download and install. The default installation settings are enough: Next, Next, Finish.
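As a quick sanity check that the SDK is installed and on the PATH, you can print the tool versions (not a required step):

 gcloud version
 gsutil version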



Open the command line and run
gcloud init 
We will be asked to log in in a browser window; enter your Google Cloud login and password. Then the command line will offer to select a project; choose the one created earlier. When asked whether to enable the API, answer Yes; when asked whether we want to manage Compute Engine, no.
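If you prefer to skip the interactive wizard (for example, when setting this up as part of automation), roughly the same result can be achieved with individual gcloud commands; my-backup-project below is a placeholder project ID, substitute your own:

 # authenticate in the browser, then bind the SDK to the project created earlier
 gcloud auth login
 gcloud config set project my-backup-project
 # verify the active account and project
 gcloud config list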



Replication repository



In a nutshell, why we needed this. There is a machine that keeps a set of backups in a specific directory (c:\bak\). These are encrypted archives, and they need to be stored somewhere off-site. No problem. Open the command line and run:



 gsutil -m rsync -r -d -e -C file://c:\bak gs://bakwin 




It is worth noting that we experimented on a Windows machine, but it works exactly the same on Linux; only the path to the directory needs to be adjusted.
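For instance, on Linux the same replication would look roughly like this; /var/backups is a hypothetical path, substitute your actual backup directory:

 gsutil -m rsync -r -d -e -C /var/backups gs://bakwin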



The command finishes and everything has flown into the cloud. Save it as a script, add it to the scheduler, and that is it. Really five minutes. With a little more tuning, the script can be adapted to a specific task and to error handling.
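To illustrate the scheduler step: the command can be saved as a small PowerShell script and registered in the Windows Task Scheduler. The file path, task name, and start time below are placeholders, adjust them to taste:

 # C:\scripts\gcs-sync.ps1 — replicate c:\bak into the bakwin bucket
 gsutil -m rsync -r -d -e -C file://c:\bak gs://bakwin

 # register a daily run at 02:00 (run once from an elevated prompt)
 schtasks /create /tn "GCS backup sync" /tr "powershell.exe -File C:\scripts\gcs-sync.ps1" /sc daily /st 02:00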



Backup Directory



In this case, we needed the data for each day to be stored in a separate directory in Google Cloud Storage. This also turned out to be simple: grab a sandwich and pour yourself a coffee.



Being a PowerShell fan, I did it in PowerShell (the machine runs Windows Server). The modules were already installed on the system together with the Cloud SDK, so to get started we need nothing beyond Import-Module GoogleCloud.
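To make sure the module is available and the SDK credentials work, a quick check can look like this; it simply lists the buckets of the current project:

 # load the module installed together with the Cloud SDK
 Import-Module GoogleCloud
 # list the buckets of the current project to confirm access
 Get-GcsBucket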



Specify where the directory to copy is and which bucket to put it in:



 $folder = "C:\Bak"
 $bucket = "gs:\backupwin"


Here you can add creating a directory for the current backup date:



 $date = Get-Date -Format dd.MM.yyyy
 $bucket = $bucket + "\" + $date
 mkdir $bucket


And the copying script itself:



 cd $folder
 $files = Get-ChildItem -Recurse -Attributes !Directory
 $data = @()
 foreach ($file in $files) {
     # remember each file together with its path relative to $folder
     $objectPath = $file | Resolve-Path -Relative
     $data += @{file = $file; objectPath = $objectPath}
 }
 cd $bucket
 foreach ($element in $data) {
     Write-Host $element.objectPath
     # create an object at the same relative path inside the bucket (via the gs:\ provider)
     New-Item -ItemType File -Path $element.objectPath
 }
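The provider-based approach above recreates the directory tree as objects under the dated folder in the bucket. As an alternative sketch, the same upload can be done with the explicit New-GcsObject cmdlet from the GoogleCloud module; the bucket name and date format are taken from the article, everything else (variable names, path handling) is purely illustrative:

 # a sketch: upload every file under C:\Bak into backupwin/<date>/<relative path>
 $folder = "C:\Bak"
 $bucketName = "backupwin"
 $date = Get-Date -Format dd.MM.yyyy

 Set-Location $folder
 Get-ChildItem -Recurse -Attributes !Directory | ForEach-Object {
     # turn .\subdir\file.ext into <date>/subdir/file.ext
     $relative = ($_ | Resolve-Path -Relative).TrimStart(".\")
     $objectName = "$date/" + ($relative -replace "\\", "/")
     New-GcsObject -Bucket $bucketName -ObjectName $objectName -File $_.FullName
 }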


We check that it works, save it as a script, and add it to the scheduler. And that is all there is to it.



Storing 10 TB of data in the cloud storage will cost from about $70 per month (10 TB ≈ 10,240 GB × $0.007 per GB ≈ $72). In general, everything works. We did not bother tuning the scripts for specific conditions.



In general, Google Cloud Storage can also be used with backup software such as CloudBerry, Veritas, etc., treating the cloud storage as additional space for backups. On the hardware side, most vendors already support replication to Google Cloud at the storage array level.



Conclusion: cheap, fast, reliable, and the move from the trial to a commercial version happens without any reconfiguration or fuss with bank cards.

Source: https://habr.com/ru/post/332474/


