
Database migration to Windows Azure SQL VM. BLOB Storage + Azure SDK

In the previous post, we practiced uploading files to Azure Storage using the REST API and uploaded the AdventureWorks2012 database backup there.
It remains to download it to the cloud virtual machine and restore it to the SQL Server installed there. In this respect, working with Azure Storage is completely symmetrical for the on-premises client and for the cloud virtual machine: they transfer files to each other via Azure Storage, one side uploading and the other downloading.

Since container1 was created as a public container, no signature is required to list and read the blobs it contains:
tststorage.blob.core.windows.net/container1/aaa.txt
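For illustration, reading such a public blob boils down to an anonymous HTTP GET; a minimal sketch (the account, container, and blob names come from the URL above, and the helper method is ours, not part of any SDK):

```csharp
using System;
using System.Net;

class PublicBlobDownload
{
    // Builds the URI of a blob in a public container; no signature is needed.
    public static Uri PublicBlobUri(string account, string container, string blob)
    {
        return new Uri(string.Format("http://{0}.blob.core.windows.net/{1}/{2}",
            account, container, blob));
    }

    static void Main()
    {
        Uri uri = PublicBlobUri("tststorage", "container1", "aaa.txt");
        using (var client = new WebClient())
        {
            // Anonymous GET: a public container requires no Authorization header.
            Console.WriteLine(client.DownloadString(uri));
        }
    }
}
```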

If the container was created as private, reading its blobs requires authentication. As we remember, it is not enough simply to pass the primary or secondary access key of the Azure Storage account: you have to carefully compose a string of a certain format and sign it with the key, as was done in Script 2 of the previous post. Working with Azure Storage through the REST API is convenient in that it requires no additional tools, being essentially bare HTTP Request/Response, but it is laborious.

In this post we will use the Azure SDK, which is convenient because it hides that preparatory work and offers more human-friendly interfaces for reading/writing blobs, managing containers, etc. It is free and available from the Windows Azure .NET Developer Center. Installing the Windows Azure SDK requires Visual Studio 2010 SP1 or 2012. The installation launches Web Platform Installer 4.0, which performs the rest. The SDK includes Windows Azure Authoring Tools - June 2012 Release, Windows Azure Emulator, Windows Azure Libraries for .NET 1.7, Windows Azure Tools for Microsoft Visual Studio, and LightSwitch.

The cloud emulator is a handy thing that lets you build cloud applications locally, so as not to pay for compute time and storage space in Azure while debugging. It consists of a compute emulator for running cloud services and a storage emulator for tables, queues, and blobs. In the REST API documentation links cited in the previous post, you probably noticed that alongside the Request URI for the GET, PUT, ... methods there is an Emulated Storage Service URI, as well as remarks such as "Note that the storage emulator only supports blob sizes up to 2 GB."
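To recall what the SDK hides from us, here is a rough sketch of the Shared Key signing step: the account key is base64-decoded and used as an HMAC-SHA256 key over the canonicalized request string, and the resulting digest is base64-encoded into the Authorization header. The string-to-sign and key below are illustrative placeholders, not the exact contents of Script 2; the real canonicalization rules are those of the REST API documentation.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class SharedKeySignature
{
    // Shared Key scheme: HMAC-SHA256 over the canonicalized request string,
    // keyed with the base64-decoded account key, result base64-encoded.
    public static string Sign(string stringToSign, string base64AccountKey)
    {
        byte[] key = Convert.FromBase64String(base64AccountKey);
        using (var hmac = new HMACSHA256(key))
        {
            byte[] digest = hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign));
            return Convert.ToBase64String(digest);
        }
    }

    static void Main()
    {
        // Abbreviated, illustrative string-to-sign for a GET of a blob.
        string stringToSign =
            "GET\n\n\n\nx-ms-date:Sun, 11 Oct 2012 00:00:00 GMT\n/tststorage/container1/aaa.txt";
        string placeholderKey = Convert.ToBase64String(new byte[64]); // not a real key
        Console.WriteLine(Sign(stringToSign, placeholderKey));
    }
}
```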
After installing the SDK, Server Explorer in Visual Studio can display cloud-related information. The Windows Azure Storage node initially shows the storage emulator objects. To connect to Azure Storage, you must add a Storage Account.

Here the Account Key is either of the primary/secondary pair, as we saw in Figure 10 of the previous post. The list of containers and their contents can now be viewed directly from the Visual Studio environment, much as in the Azure Management Portal.
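Under the hood, such a connection is described by a storage connection string, which the SDK and the tooling both understand; a sketch (the key value is, of course, a placeholder):

```
DefaultEndpointsProtocol=https;AccountName=tststorage;AccountKey=<primary-or-secondary-key>
```

For the local storage emulator, the shortcut `UseDevelopmentStorage=true` is enough.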
A blob can be opened by choosing Open from its context menu, by clicking the open button in the toolbar, or simply by double-clicking the blob. It is then downloaded to a local temporary directory. The blob can be edited and saved to a local file. To save it (or any other local file) back to Azure Storage, VS 2010 had an Upload Blob button. In VS 2012 I simply cannot find it, and I am not alone.
We will use the Azure SDK object model to read the file from Azure Storage. With it, the code is shorter and more readable than with the REST API.
using System;
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class Program
{
    static void Main(string[] args)
    {
        string storageAccount = "tststorage";
        string accountPrimaryKey = "xws7rilyLjqdw8t75EHZbsIjbtwYDvpZw790lda0L1PgzEqKHxGNIDdCdQlPEvW5LdGWK/qOZFTs5xE4P93A5A==";
        string blobContainer = "container1";
        string blobName = "AdventureWorks2012.bak"; // "aaaa.txt";

        // Authenticate with the account name and key; true = use HTTPS.
        CloudStorageAccount acct = new CloudStorageAccount(
            new StorageCredentialsAccountAndKey(storageAccount, accountPrimaryKey), true);
        CloudBlobClient clnt = acct.CreateCloudBlobClient();
        CloudBlobContainer cntr = clnt.GetContainerReference(blobContainer);
        CloudBlob blob = cntr.GetBlobReference(blobName);

        // Download the backup to a local file on the virtual machine.
        blob.DownloadToFile(@"c:\temp\AdventureWorks2012.bak");
    }
}
Script 1

Beforehand, you need to add a reference to Microsoft.WindowsAzure.StorageClient in the project's References.
We build the project, take the exe and the Microsoft.WindowsAzure.StorageClient.dll library from the solution's Bin\Debug folder, copy them over a remote access session to the cloud virtual machine (which has neither Visual Studio nor the Windows Azure SDK), and run the exe; the AdventureWorks2012.bak file is downloaded from Azure Storage onto the virtual machine. This takes about a minute. After that, we open SSMS and restore the backup to the SQL Server on the cloud virtual machine.
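The restore in SSMS can equally be done with a T-SQL command; a sketch, assuming the default instance and the download path used above (if the logical file locations differ on the VM, MOVE clauses would be needed):

```sql
RESTORE DATABASE AdventureWorks2012
FROM DISK = 'c:\temp\AdventureWorks2012.bak'
WITH RECOVERY, STATS = 10;
```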
Note that the described way of transferring a backup from the local machine to the cloud virtual machine via Azure Storage cost us nothing in terms of traffic: ingress traffic to the cloud is free by definition, and the Storage Account used for intermediate storage of the backup was created in the same datacenter as the virtual machine, so downloading the backup from tststorage to the virtual machine was also free. However, in terms of occupied space, AdventureWorks2012.bak was stored twice: first it was uploaded to blob storage, then downloaded onto the VHD, which itself is also stored as a blob. For backups of significant size this may entail additional costs for space - see pricing, section Storage. In the next post we will look at how to optimize these costs.

Source: https://habr.com/ru/post/156943/

