Automating migration of SQL to Azure SQL DB using the Azure Database Migration PowerShell module

This post has been republished via RSS; it originally appeared at: Microsoft Data Migration articles.

During the public preview of the Azure Database Migration Service (DMS), we released a PowerShell library to automate bulk database migrations from SQL Server to Azure SQL Database. Currently, the Azure DMS documentation includes a “How to” article for Migrating SQL Server on-premises to Azure SQL Database by using Azure PowerShell. However, several customers have asked for information about how to perform an end-to-end migration, including schema migration and creation of the target infrastructure in Azure, all from a PowerShell script. This blog post describes one path you can use to accomplish that goal. In it, we will use Azure PowerShell, the DMS PowerShell module, and Python with mssql-scripter to migrate three databases located on a source SQL Server instance: AdventureWorks2012, AdventureWorks2014, and AdventureWorks2016.

 

First, you need to consider a few general prerequisites:

  • Azure DMS requires that an Azure Virtual Network (VNET) be created by using the Azure Resource Manager deployment model, which provides site-to-site connectivity for your on-premises source servers by using either ExpressRoute or VPN. Additional information about creating a VNET is available in the Virtual Network Documentation.
  • After the Azure VNET has been created, you need to ensure that the associated Network Security Group rules do not block the following communication ports: 443, 53, 9354, 445, and 12000. For more detail on Azure VNET NSG traffic filtering, see the article Filter network traffic with network security groups.
  • If you are using a firewall appliance in front of your source database(s), you may need to add firewall rules to allow the Azure DMS to access the source database(s) for migration.
  • Windows PowerShell version 5.0 or later is required. To install or update Windows PowerShell to version 5.0 or later, see the article Installing Windows PowerShell.
  • Download and install Azure PowerShell from Azure Downloads https://azure.microsoft.com/en-us/downloads/.
  • Download and install the AzureRM.DataMigration module from the PowerShell Gallery by using the Install-Module PowerShell cmdlet.
  • On the computer hosting the source databases, configure the Windows Firewall for access by the SQL Server database engine, as explained in the article Configure a Windows Firewall for Database Engine Access.
  • The credentials used to connect to the source SQL Server instance must have the CONTROL SERVER permission.
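If you need to grant that permission to a dedicated migration login, you can do so with sqlcmd. This is only a sketch: it assumes the source instance mySourceSQL and the Admin login used later in this post, and that you can connect with Windows authentication to issue the grant.

```
# Hypothetical example: grant CONTROL SERVER on the source instance to the
# SQL login that DMS will use to connect ("Admin" in the examples below).
# -E uses Windows authentication for this administrative connection.
sqlcmd -S mySourceSQL -E -Q "GRANT CONTROL SERVER TO [Admin];"
```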

After addressing all of the prerequisites listed above, download and install Python 3.6 for Windows from https://www.python.org/downloads/ on the computer that will run your scripts.

 


 

By default, the Python 3.6 installer provides the 32-bit version; a 64-bit version is also available, but the 32-bit version is recommended for this walkthrough. Run through the installer, select Custom Installation, and then select all of the Optional Features listed on the installer screen. Before completing the installation, set the system’s PATH variable to include the directories that point to the Python components and packages we’ll add later. To do this:

 

1. In Control Panel, search for Environment; select Edit the System Environment Variables, and then select Environment Variables.

 

2. In the User Variables section, you need to either edit an existing PATH variable or create one. If you are creating one, enter PATH as the variable name and add the following directories to the variable value, separated by semicolons. If you’re editing an existing PATH, the values are presented on separate lines in the edit dialog; select New and add one directory per line.

 

C:\Python36-32
C:\Python36-32\Lib\site-packages\
C:\Python36-32\Scripts\
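If you prefer to script this step rather than use the Environment Variables dialog, a minimal sketch (assuming the default 32-bit install paths shown above) is:

```
# Append the Python directories to the current user's PATH.
# Assumes the default 32-bit install location shown above.
$pythonDirs = 'C:\Python36-32', 'C:\Python36-32\Lib\site-packages\', 'C:\Python36-32\Scripts\'
$current = [Environment]::GetEnvironmentVariable('Path', 'User')
$newPath = (($current, ($pythonDirs -join ';')) -join ';').Trim(';')
[Environment]::SetEnvironmentVariable('Path', $newPath, 'User')
```

Open a new command prompt afterward so that the updated PATH is picked up.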

3. Return to the Python installation process, and complete the installation.

 

4. At the command prompt (Start Menu > Windows System > Command Prompt), type Python and then press Enter to load the Python interpreter.

 


 

Next, you need to install mssql-scripter, which is a multi-platform command line experience for scripting SQL Server databases. This library is similar to the widely used Generate Scripts Wizard experience in SSMS. You can use mssql-scripter on Linux, macOS, and Windows to generate data definition language (DDL) and data manipulation language (DML) T-SQL scripts for database objects in SQL Server. To install mssql-scripter:

 

1. At the Windows command prompt (not inside the Python interpreter), type pip install mssql-scripter, and then press Enter.

 

2. After the installation is complete, you can verify that the library installed successfully by typing mssql-scripter -h and pressing Enter.
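For a quick smoke test, you can script a single database to a file. This sketch assumes a local default SQL Server instance reachable with Windows authentication and an AdventureWorks2016 database; adjust the server, database, and output path to your environment:

```
# Hypothetical smoke test: script the schema of one database to a file,
# connecting to a local default instance with Windows authentication.
mssql-scripter -S localhost -d AdventureWorks2016 --script-create -f C:\temp\AdventureWorks2016.sql
```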

 

You should get usage help output confirming that the installation succeeded.

 

3. Now, start PowerShell as administrator so that you can log in to your Azure subscription.

 

4. As of December 2018, the Azure PowerShell Az module is generally available and is now the intended PowerShell module for interacting with Azure. Az offers shorter commands, improved stability, and cross-platform support, along with feature parity and an easy migration path from AzureRM. Because the commands below use AzureRM syntax, we enable the AzureRM aliases for compatibility:

 

# enable aliasing of AzureRM cmdlet names
Enable-AzureRmAlias

5. Next let’s make sure that Azure PowerShell and Data Migration modules are installed by running this command:

 

# see all installed modules
Get-InstalledModule

You should see a list of the installed Az modules, including Az.DataMigration.

 

6. Run the Login-AzureRmAccount cmdlet to launch the Azure log-in experience, and then in the pop-up window, enter the credentials for your Azure subscription.

 

After logging in, PowerShell displays details about your subscription.
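You can also confirm which subscription the session is using at any point:

```
# Display the account, tenant, and subscription for the current session
Get-AzureRmContext
```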

 

7. Use Azure PowerShell to prepare the target environment in Azure:

 

a. Create a Resource Group. Azure resource groups provide a way to monitor, control access to, provision, and manage billing for collections of assets that are required to run an application or are used by a client or company department. Azure Resource Manager (ARM) is the technology that works behind the scenes so that you can administer assets by using these logical containers.

 

I will declare a couple of variables to hold resource group and Azure region information and then use the New-AzureRmResourceGroup cmdlet to create the resource group.

 

# variables declaration
$resourcegroupname = "my_rg"
$location = "Central US"

# create the resource group my_rg in the Central US region, if it doesn't already exist
$resourceGroup = Get-AzureRmResourceGroup -Name $resourcegroupname -Location $location -ErrorAction SilentlyContinue
if ( -not $resourceGroup ) {
    New-AzureRmResourceGroup -Name $resourcegroupname -Location $location
}

 

b. Create the target logical Azure SQL server that will host the target Azure SQL databases. Again, I will declare a couple of variables, in addition to those declared in the script above, to hold the server, database, login, and firewall IP information needed as parameters.

 

# variables declaration
$servername = "myServer"
$adminlogin = "admin123"
$password = "p@ssword"

# The IP address range that you want to allow to access your server - change as appropriate
$startip = "0.0.0.0"
$endip = "0.0.0.0"

# databases
$databasename = @("AdventureWorks2012", "AdventureWorks2014", "AdventureWorks2016")

# Create server and firewall rule, if the server doesn't already exist
$sqlservername = Get-AzureRmSqlServer -ResourceGroupName $resourcegroupname -ServerName $servername -ErrorAction SilentlyContinue

if ( -not $sqlservername ) {
    New-AzureRmSqlServer -ResourceGroupName $resourcegroupname -ServerName $servername -Location $location -SqlAdministratorCredentials $(New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $adminlogin, $(ConvertTo-SecureString -String $password -AsPlainText -Force))

    # Set up firewall rule
    Write-Output "Setting up Firewall Rule"
    New-AzureRmSqlServerFirewallRule -ResourceGroupName $resourcegroupname -ServerName $servername -FirewallRuleName "AllowSome" -StartIpAddress $startip -EndIpAddress $endip
}
else {
    Write-Output "Azure SQL Server '$servername' already exists"
}

 

c. Create the Azure SQL databases that reside on the logical Azure SQL server. The script below uses the New-AzureRmSqlDatabase Azure PowerShell cmdlet to create three AdventureWorks databases (named AdventureWorks2012, AdventureWorks2014, and AdventureWorks2016) in the Premium tier (P1). It shows how easily you can create multiple target Azure SQL databases, named the same as the source databases, on the logical server in a single migration workflow.

 

# create each target database in the Premium (P1) tier
For ($i=0; $i -lt $databasename.Length; $i++) {
    New-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname -ServerName $servername -DatabaseName $databasename[$i] -RequestedServiceObjectiveName "P1"
}

 

Now you should be able to see the target Azure SQL logical server, the firewall rule, and the three target Premium tier Azure SQL databases in the Azure portal.
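If you would rather verify from PowerShell than from the portal, you can list the databases on the new logical server using the variables defined above:

```
# List the databases on the new logical server (includes the master database)
Get-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname -ServerName $servername |
    Select-Object DatabaseName, Edition, CurrentServiceObjectiveName
```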

 

d. Generate and deploy the source database schema by using mssql-scripter. To use the library, we construct a dynamic call that passes in the source database information. In the example below, we script all three source AdventureWorks databases located on the SQL Server instance. This first step creates a .ps1 PowerShell file for every database being scripted in the path c:\temp\migration\; running those files will produce the corresponding .sql schema files. If you don’t have this path on your machine, you can use any other folder on which you have the appropriate permissions. As you can see below, we connect to a source instance called mySourceSQL with the login Admin and password p@ssword1.

 

# generate a .ps1 script file for each database; each one invokes mssql-scripter
For ($i=0; $i -lt $databasename.Length; $i++) {
    $SQLFile = "C:\temp\migration\" + $databasename[$i] + ".sql"
    $File = "C:\temp\migration\schema_run_" + $databasename[$i] + ".ps1"
    "mssql-scripter -S mySourceSQL -d " + $databasename[$i] + " -U Admin -P p@ssword1 --script-create --exclude-headers --check-for-existence -f $SQLFile --continue-on-error --target-server-version AzureDB --display-progress --exclude-use-database" | Out-File $File
}

 

8. After the .ps1 files are created, you should see them in the folder, in this case c:\temp\migration\.

9. Now we will execute these .ps1 scripts via PowerShell to create the schema files with a .sql extension in the same path.

 

# execute the scripts to script out the schema for all databases
For ($i=0; $i -lt $databasename.Length; $i++) {
    $File = "C:\temp\migration\schema_run_" + $databasename[$i] + ".ps1"
    & $File
}

 

You should now see the additional .sql schema files in the same path.

10. After the schema is scripted, deploy it by using the following script:

 

# set up the fully qualified name for the Azure SQL logical server
$servername = $servername + ".database.windows.net"

# deploy the schema on each target database
For ($i=0; $i -lt $databasename.Length; $i++) {
    $SQLFile = "C:\temp\migration\" + $databasename[$i] + ".sql"
    sqlcmd -U $adminlogin -S $servername -P $password -d $databasename[$i] -j -i $SQLFile
}

 

11. Now that the schema is deployed, migrate the data by creating an instance of Azure DMS named TestDMS on a virtual subnet named MySubnet. Note that we are using the Standard_4vCores SKU, and that the -VirtualSubnetId parameter expects a full resource ID in the format /subscriptions/<subscription ID>/resourcegroups/<resourceGroup>/providers/Microsoft.Network/virtualNetworks/<VNET>/subnets/<subnet>.

 

# create the DMS service
$servicename = "TestDMS"
$sku = "Standard_4vCores"
$virtualSubNetId = "/subscriptions/f24ac2e5-33e9-4f6c-871d-546fcee6c79/resourceGroups/MigrateDemo/providers/Microsoft.Network/virtualNetworks/MyVNET/subnets/MySubnet"

$service = New-AzureRmDms -Name $servicename -Location $location -ResourceGroupName $resourcegroupname -VirtualSubnetId $virtualSubNetId -Sku $sku

 

12. The next step is to create an Azure DMS migration project named MyDMSProject. It’s here that we enter the source and target types, the instance, and the source database information needed for the migration:

 

$ProjectName = "MyDMSProject"

# create source and target connection info
$sourceConnInfo = New-AzureRmDmsConnInfo -ServerType SQL -DataSource "mySourceSQL" -AuthType SqlAuthentication -TrustServerCertificate:$true
$targetConnInfo = New-AzureRmDmsConnInfo -ServerType SQL -DataSource $servername -AuthType SqlAuthentication -TrustServerCertificate:$true

# create database info objects by iterating through the source db list
$dbList = [System.Collections.ArrayList]@()
For ($i=0; $i -lt $databasename.Length; $i++) {
    $dbInfo = New-AzureRmDataMigrationDatabaseInfo -SourceDatabaseName $databasename[$i]
    $dbList.Add($dbInfo)
}
$project = New-AzureRmDataMigrationProject -Name $ProjectName -Location $location -ResourceGroupName $resourcegroupname -ServiceName $service.Name -SourceType SQL -TargetType SQLDB -SourceConnection $sourceConnInfo -TargetConnection $targetConnInfo -DatabaseInfo $dbList

 

13. The last steps required for the migration are to pass credentials, map tables between the source and the target, and run the DMS migration task named myDMSTask, which migrates the data.

 

a. Start by converting the credentials to PSCredential objects. Because these are server-level credentials, we use the same credential for all databases on each side:

 

$taskName = "myDMSTask"

# convert credentials for the source
$secpasswd = ConvertTo-SecureString -String "p@ssword1" -AsPlainText -Force
$sourceCred = New-Object System.Management.Automation.PSCredential ("Admin", $secpasswd)

# convert credentials for the target
$secpasswd = ConvertTo-SecureString -String $password -AsPlainText -Force
$targetCred = New-Object System.Management.Automation.PSCredential ($adminlogin, $secpasswd)

 

b. Next, map the tables and databases for a task. Because all of the databases in this example use the same schema, we only need to create one table map, which can be reused for this migration. In other scenarios, we would create a different table map for each task/database migration.

 

#create table map
$tableMap = New-Object 'system.collections.generic.dictionary[string,string]'
$tableMap.Add("HumanResources.Department", "HumanResources.Department")
$tableMap.Add("HumanResources.Employee","HumanResources.Employee")
$tableMap.Add("HumanResources.EmployeeDepartmentHistory","HumanResources.EmployeeDepartmentHistory")
$tableMap.Add("HumanResources.EmployeePayHistory","HumanResources.EmployeePayHistory")
$tableMap.Add("HumanResources.JobCandidate","HumanResources.JobCandidate")
$tableMap.Add("HumanResources.Shift","HumanResources.Shift")

 

c. Now, for each database, we will map the source database to its target using a SelectedList array list, and then run a task to migrate the data. Note that the source and target databases share the same names, so we can use the same array of database names on both sides.

 

# map selected databases
$dbSelectedList = [System.Collections.ArrayList]@()
For ($i=0; $i -lt $databasename.Length; $i++) {
    $selectedDb = New-AzDataMigrationSelectedDB -MigrateSqlServerSqlDb -Name $databasename[$i] -TargetDatabaseName $databasename[$i] -TableMap $tableMap
    $dbSelectedList.Add($selectedDb)
}

# run the migration task
$myTask = New-AzureRmDmsTask -Name $taskName -ResourceGroupName $resourcegroupname -ServiceName $service.Name -ProjectName $project.Name -TaskType MigrateSqlServerSqlDb -SourceCred $sourceCred -TargetCred $targetCred -SourceConnection $sourceConnInfo -TargetConnection $targetConnInfo -SelectedDatabase $dbSelectedList

 

14. You can monitor a running migration task by querying its State property, as shown in the following example:

 

# wait for the task to finish by polling its state
do {
    Start-Sleep -s 15
    $myTask = Get-AzureRmDmsTask -Name $taskName -ServiceName $service.Name -ProjectName $project.Name -ResourceGroupName $resourcegroupname
} while (($myTask.ProjectTask.Properties.State -eq "Running") -or ($myTask.ProjectTask.Properties.State -eq "Queued"))

 
