In previous articles on migrating SQL Server to the cloud, we looked at various options for transferring a database backup to the Cloud. To recap: this is the IaaS approach, i.e. deploying SQL Server on a Windows Azure virtual machine. The alternative PaaS approach is fundamentally different, since Windows Azure SQL Database (SQL Azure) does not support the regular BACKUP / RESTORE T-SQL commands, nor does it allow detaching and attaching database files (detach / attach). Other methods must be used there, such as DAC, BCP, SSIS, SQL Azure Data Sync, etc.

In this article we continue to review the methods related to IaaS. Since making and restoring a backup poses no problem there, the main practical question is how best to transfer the database backup to the Cloud and back. In this respect it is no different from uploading or downloading any blob to or from Azure Storage. We have already considered using the Blob Service REST API (http://msdn.microsoft.com/en-us/library/dd135733.aspx), which is very simple in concept but rather laborious in practice, because it requires you to carefully form the body of the PUT request. The process is simplified by the wrapper provided by the Azure SDK (http://msdn.microsoft.com/en-us/library/dd179380.aspx), which includes ready-made classes that encapsulate the preparatory work you would otherwise do manually with the "raw" REST API. Finally, we reviewed transferring a backup on a separate VHD disk attached to the cloud virtual machine.

With the release of Cumulative Update 2 for SQL Server 2012 SP1, this process becomes even simpler: it is now possible to create database backups directly in Azure Storage using standard T-SQL commands and, accordingly, restore from them. All of SQL Server's auxiliary work of transferring the backup to the Cloud happens inside the T-SQL commands. We only need a storage account (Windows Azure Storage Account), which will be used as intermediate storage for the backup.
We discussed the process of creating a storage account in the article devoted to the Blob Service REST API (http://blogs.technet.com/b/isv_team/archive/2012/10/25/3528566.aspx), so we will not dwell on it in detail here. Let me remind you just in case that it makes sense to create the account in the same datacenter as the virtual machine with SQL Server on which the backup will be restored, in order to save on traffic between datacenters. For example, in my case the lvmtest4 virtual machine with a default SQL Server instance is located in the North Europe region, and the tststorage account for transferring the database backup will be created there as well. The creation process is described in more detail in the Windows Azure documentation (http://msdn.microsoft.com/en-us/library/windowsazure/gg433066.aspx). Inside the account we will create a container named container1 (capital letters in container names are not supported). For security purposes, the container will be created as private, which means that accessing its contents requires one of the account's access keys (primary or secondary), which can be viewed in the Storage Account properties.
Among the prerequisites, we also need CU2 for SQL Server 2012 SP1 installed on the local SQL Server. As usual, cumulative updates will most likely be included in the next service pack, but at the time of this writing you must install Service Pack 1 for SQL Server 2012, if it is not already installed (http://www.microsoft.com/en-us/download/details.aspx?id=35575), and then Cumulative Update 2 for this service pack (http://support.microsoft.com/kb/2790947). When we went through this topic at the Windows Azure Summit, the listeners paid attention to the phrase "This service pack contains SQL Server 2012 Cumulative Update 1 (CU1) and Cumulative Update 2 (CU2)", so it is probably best to clarify once again: that phrase refers to the cumulative updates for the "bare" SQL Server 2012, whereas we need CU2 for SP1. The link opens a form in which you mark which files you need (I marked all three) and specify an e-mail address, to which a letter from hotfix@microsoft.com arrives almost immediately, containing the download links. For our purposes it is enough to install SQLServer2012_SP1_CU2_2790947_11_0_3339_x64, so that select @@version reports no less than Microsoft SQL Server 2012 (SP1) - 11.0.3339.0 (X64), Jan 14 2013 19:02:10, Build 9200.
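A quick way to check whether the required update level is already in place is to query the server's build number; this check is not part of the original installation steps, just a convenient sanity test. The build reported should be 11.0.3339 or later:

```sql
-- Check the installed SQL Server build; BACKUP TO URL requires
-- SQL Server 2012 SP1 CU2, i.e. build 11.0.3339 or later.
SELECT SERVERPROPERTY('ProductVersion') AS build,
       SERVERPROPERTY('ProductLevel')   AS service_pack,
       @@VERSION                        AS full_version_string;
```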
In order for the local SQL Server to be able to access the cloud storage, you must first create a credential on it, which will contain the name of the storage account and one of its access keys (primary or secondary, it makes no difference which).
if exists (select 1 from sys.credentials where name = 'SqlToAzureStorage' and credential_identity = 'tststorage')
    drop credential SqlToAzureStorage
CREATE CREDENTIAL SqlToAzureStorage
WITH IDENTITY= 'tststorage' --storage account
, SECRET = 'oY1AUn6/5/IWz8dfQJzidOVY8HRUKOz1k5MsSnV86xV46fEtQCAigC3Fd8Lgkn2fv6SotsRpZm6w2tRaQVAovw=='
Script 1
The access keys can be found in the storage account management section of the Windows Azure Management Portal, if you select Configure in the top menu and Manage Keys in the bottom menu.
Pic.1
Pic.2
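To make sure the credential from Script 1 is actually in place before attempting a backup, it can be looked up in sys.credentials; note that the secret itself is stored encrypted and is never returned by this view:

```sql
-- The credential name and identity (storage account name) are visible;
-- the secret (access key) is not exposed through this catalog view.
SELECT name, credential_identity, create_date
FROM sys.credentials
WHERE name = 'SqlToAzureStorage';
```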
That is actually all. Let us create a backup of our favorite AdventureWorks database in the cloud storage.
BACKUP DATABASE AdventureWorks2012
TO URL = 'https://tststorage.blob.core.windows.net/container1/AdventureWorks2012.bak'
WITH CREDENTIAL = 'SqlToAzureStorage', INIT, STATS = 10
Script 2
Everything here is straightforward. The URL is formed as https://&lt;Azure Storage Account name&gt;.blob.core.windows.net/&lt;container name&gt;/&lt;name of the backup file to be created in that container&gt;. It can be viewed in the Windows Azure Management Portal in the container properties:
Pic.3
If CU2 for SP1 is not installed, executing this command produces the error: Msg 155, Level 15, State 1, Line 2. 'URL' is not a recognized Device Type option. But since we have installed it, everything completes successfully.
Pic.4
If we now look into container1 from Pic.3, we will see that the AdventureWorks2012.bak backup has been created there.
Comments:
• The maximum backup size should not exceed 1 TB, due to the limitations of Azure Blob Storage.
• It is possible to create not only a full database backup, but also transaction log backups, file and filegroup backups, and differential backups; backup compression is supported as well.
• If a file with the same name already exists in the container, an error occurs. To overwrite it, use the FORMAT option.
• Creating a database backup in Azure Blob Storage from the SSMS interface is not yet supported.
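To illustrate the points above, here is a sketch of a compressed transaction log backup to the same container, using FORMAT to overwrite an existing blob of the same name. The file name AdventureWorks2012.trn is just an example chosen for this illustration:

```sql
-- Log, differential and compressed backups work with the URL device
-- the same way as full backups; FORMAT overwrites an existing blob.
BACKUP LOG AdventureWorks2012
TO URL = 'https://tststorage.blob.core.windows.net/container1/AdventureWorks2012.trn'
WITH CREDENTIAL = 'SqlToAzureStorage', COMPRESSION, FORMAT, STATS = 10
```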
Now we go to the cloud VM with SQL Server installed on it, create the same credential (Script 1), and completely symmetrically restore the database on it from the newly created cloud backup. Obviously, the SQL Server there must also be at the SP1 CU2 level.
RESTORE DATABASE AdventureWorks2012
FROM URL = 'https://tststorage.blob.core.windows.net/container1/AdventureWorks2012.bak'
WITH CREDENTIAL = 'SqlToAzureStorage', REPLACE, STATS = 10
Script 3
Pic.5
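If the drive layout on the cloud VM differs from the on-premises machine, the data and log files can be relocated during the restore. RESTORE FILELISTONLY shows the logical file names needed for WITH MOVE; the F:\ paths below are an assumption for this example, not part of the original walkthrough:

```sql
-- Inspect the logical file names inside the cloud backup ...
RESTORE FILELISTONLY
FROM URL = 'https://tststorage.blob.core.windows.net/container1/AdventureWorks2012.bak'
WITH CREDENTIAL = 'SqlToAzureStorage'

-- ... and restore them to paths that exist on the cloud VM
-- (the target paths here are illustrative).
RESTORE DATABASE AdventureWorks2012
FROM URL = 'https://tststorage.blob.core.windows.net/container1/AdventureWorks2012.bak'
WITH CREDENTIAL = 'SqlToAzureStorage', REPLACE, STATS = 10,
     MOVE 'AdventureWorks2012_Data' TO 'F:\Data\AdventureWorks2012.mdf',
     MOVE 'AdventureWorks2012_Log'  TO 'F:\Logs\AdventureWorks2012_log.ldf'
```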
Similarly, you can do the opposite: create a backup in the Cloud and restore it on an on-premises SQL Server, as well as transfer a database between cloud or on-premises machines.