Migrate SQL Server to Azure SQL Managed Instance at Scale


In this blog post, we'll walk you through the entire process of migrating your on-premises databases to Azure SQL Managed Instance. We'll explore a PowerShell script that simplifies the migration, allowing you to seamlessly transition your on-premises SQL Server workloads to Azure SQL Managed Instance. The script leverages the capabilities of Azure Database Migration Service (DMS) in combination with Azure Blob Storage to ensure a smooth and efficient database migration journey. Alternatively, a network file share can be used for end-to-end migration of databases from an on-premises file share to Azure SQL Managed Instance.



Before diving into the migration process, ensure you have the following prerequisites in place:

     Script Requirements:


  • PowerShell

The following PowerShell modules should be installed on the data collector machine:
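Based on the commented `Install-Module` lines in the script at the end of this post, the required Az modules can be installed once on the data collector machine as follows:

```powershell
# One-time installation of the Az modules the migration script imports
Install-Module -Name Az.Accounts      -AllowClobber
Install-Module -Name Az.Storage       -AllowClobber
Install-Module -Name Az.Sql           -AllowClobber
Install-Module -Name Az.DataMigration -AllowClobber
```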






  • Migration Service

An instance of Database Migration Service and a migration project are required to migrate databases from your backup location (Blob Storage or a network file share) to the Managed Instance. Refer to the Azure documentation to learn how to create a migration service.



  • Azure Subscription
  • Access to an Azure subscription where the Azure SQL Managed Instance is created.
  • The Owner or Contributor role on the Azure SQL Managed Instance.
  • A provisioned Azure SQL Managed Instance with write access to restore databases.

Preparing for the migration:

The full PowerShell script is at the end of this blog post. Please follow the Excel screenshots to create the CSV files.


Ensure the FilestoUpload and Migration-Config CSV files have been updated with the right configuration values.

  • Blob Storage:
    • Read and write access to the Azure storage account.
    • A storage location to temporarily store the backup files during migration.
    • Create an Azure Storage Account for creating a blob container.
      • Full, differential, and log backups of each database should be placed in the same container.
      • Existing containers can also be used to start data migration.
  • File Share:
    • A storage location in on-premises to temporarily store the backup files during migration.

Note: For every individual database, there should be an individual folder containing all backup files (full, differential, and log) without sub-directories.
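For example, with hypothetical paths and database names, a valid backup location could be laid out as:

```
D:\Backups\
    AdventureWorks\        <- one folder per database, no sub-directories
        AdventureWorks_FULL.bak
        AdventureWorks_DIFF.bak
        AdventureWorks_LOG_1.trn
    SalesDB\
        SalesDB_FULL.bak
        SalesDB_LOG_1.trn
```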


Configuration Files:


The FilestoUpload.csv file is used to transfer files from the on-premises directory to the storage account. Make sure to fill in the necessary details in the columns as described below.




           SubscriptionID: Subscription ID of the storage account.

           ResourceGroupName: Name of the resource group.

           IsActive: Either 1 or 0 (true or false). Set to 0 if the database should not be uploaded during the upload process.

           StorageAccountName: Name of the storage account to which the backup files are copied.

           ContainerName (optional): Name of the container that will hold the backup files for the database in the Azure storage account. If empty, the default naming convention is ServerName-DatabaseName.

           ServerName: Hostname of the machine containing the SQL Server instance.

           DatabaseName: Name of the database.

           BackupPath: Full directory path containing all backups.

           FolderName: Existing folder name containing the backup files (default: DatabaseName).
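To make the column descriptions concrete, a sample FilestoUpload.csv row could look like the one below. All values (and the column order) are hypothetical; use the screenshots above as the authoritative layout.

```
SubscriptionID,ResourceGroupName,IsActive,StorageAccountName,ContainerName,ServerName,DatabaseName,BackupPath,FolderName
00000000-0000-0000-0000-000000000000,rg-migration,1,migrationstorage01,,SQLSRV01,AdventureWorks,D:\Backups,AdventureWorks
```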



The Migration-Config.csv file contains the server(s) and database(s) migration information and is used to start the migration to the Managed Instance.




         SubscriptionId: Subscription ID containing all the Azure resources.

         IsActive: Either 1 or 0 (true or false).

         SourceSqlConnectionDataSource: Host/instance name of the source machine. It should match the value in FilestoUpload.csv so the backup folders can be located in the Azure storage account.

         DatabaseName: Source database name.

         StorageAccountName: Name of the storage account containing the backup files.

         StorageAccountKey (optional): In case of restricted access, the primary key of the storage account to gain access to the containers.

         ContainerName (optional): Name of the container that stores the backup files. The default naming convention is SourceSqlConnectionDataSource-DatabaseName.

         MigrationServiceName: DMS v2 service name.

         ResourceGroupName: Name of the resource group.

         ManagedInstanceName: Name of the Azure SQL Managed Instance.

         AutomaticCutover: Either 1 or 0. If 1 (true), the monitoring script initiates the cutover as soon as DMS finishes the database restore.

NOTE: A typical migration involves several databases, and you can set 0 or 1 per database depending on whether you want to monitor only or cut over automatically. When 0 is specified, the script runs in monitoring mode only, and you initiate the cutover manually once you are satisfied with the migration. When 1 is specified, the cutover proceeds automatically as soon as the database restore is complete. Automatic cutover can be useful for a single large database that has no differential or transaction log backup files.
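A sample Migration-Config.csv row, again with hypothetical values and illustrative column order, might look like:

```
SubscriptionId,IsActive,SourceSqlConnectionDataSource,DatabaseName,StorageAccountName,StorageAccountKey,ContainerName,MigrationServiceName,ResourceGroupName,ManagedInstanceName,AutomaticCutover
00000000-0000-0000-0000-000000000000,1,SQLSRV01,AdventureWorks,migrationstorage01,,,dms-migration-svc,rg-migration,sqlmi-target01,0
```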


Script Execution and Setup

The image below shows the modules the script depends on during execution.




The significance of the different files is described below:

  • Invoke-Migration: This PowerShell script contains the automation logic.
  • Config: This folder contains the CSV files that specify the target SQL instance details.



Script Execution Steps


  • Ensure all prerequisites are met.
  • Right-click the Invoke-Migration.ps1 file, open it in PowerShell ISE, and click the play button to load all modules into memory.
  • A prompt will appear to connect to the Azure subscription from the VM that you use to run the PowerShell script.


If any execution-related error occurs, perform the steps below to troubleshoot and allow PowerShell script execution.

  • You can bypass the execution policy at the session level by running the command below on the PowerShell prompt.

Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass




       If a global domain policy prevents you from changing the execution policy, you can work around it by unblocking the script file:

                               Unblock-File -Path .\Invoke-Migration.ps1

Once the execution policy is configured, click on the play button to load all modules in memory.


How to Run the Migration Tool:

       The program processing logic and information flow are illustrated in the diagram below.



     Copy backup files:

  • Copy the backup files of each database to the respective containers in the storage account. To do this, make sure the information is filled in the FilestoUpload.csv file.
  • To copy backup files to the storage account, the AzCopy utility must be present in the tool folder.

           AzCopy can be downloaded from the official Microsoft download page.




Run the command below to start copying backup files to the Azure storage account:
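The upload is driven by the Invoke_DMS_UploadToBlobStorage function defined in the script at the end of this post. After loading the script in PowerShell ISE, run:

```powershell
# Uploads each database's backup files (per FilestoUpload.csv) to its blob container via AzCopy
Invoke_DMS_UploadToBlobStorage
```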



   Start Migration

  • Before starting the migration, ensure you have copied all of the database backup files into the container that will be used by the Database Migration Service to restore the databases from Azure Blob Storage to the Managed Instance.
  • The migration script is the starter script of the migration process. It starts the migration according to the data in the Migration-Config.csv file.
  • To run the migration, use the command below:
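The migration is started by the Invoke_DMSMigrationFromBlobtoMI function defined in the script at the end of this post:

```powershell
# Starts a DMS migration to the Managed Instance for every active row in Migration-Config.csv
Invoke_DMSMigrationFromBlobtoMI
```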


          Note: Migration will fail if backup files are not found in the given blob container.

          After the migration has started, run the monitoring script to check the status of the migration.


Migration Monitoring/cutover:

The final step of the tutorial is to complete the migration cutover. Completion ensures the migrated database on the Azure SQL Managed Instance is ready for use. Downtime is required for applications that connect to the database, so the timing of the cutover needs to be carefully planned with business or application stakeholders.


To Complete the cutover:

  • Stop all incoming transactions to the source database.
  • Make application configuration changes to point to the target database on the Azure SQL Managed Instance.
  • Take any tail log backups for the source database in the backup location specified.
  • Ensure all database backups have the status Restored on the monitoring details page.
  • Select Complete Cutover on the monitoring details page.
  • During the cutover process, the migration status changes from In progress to Completing. The migration status changes to Succeeded when the cutover process is complete. The database migration is successful, and the migrated database is ready for use.
  • The monitoring script reports the migration status of each database and initiates the automatic cutover for any database whose AutomaticCutover value is 1 in the Migration-Config.csv file.

          You can monitor or cut over using the PowerShell commands (as the functions are named in the script below) or the Azure Portal:

                         Monitor:         Invoke_DMSMigrationStatus

                         Cutover:         Invoke_DMSMigrationCutover


            Azure Portal:

                 You can also navigate to Azure Portal > Database Migration Service > Migration project and then click the databases to see the progress.

                 Select Complete Cutover on the monitoring details page for the final cutover.






```powershell
cls
#Install-Module -Name Az.Sql -AllowClobber
#Install-Module -Name Az.Storage -AllowClobber
#Install-Module -Name Az.Accounts -AllowClobber
#Install-Module -Name Az.DataMigration -AllowClobber
Import-Module Az.DataMigration
Import-Module Az.Storage
Import-Module Az.Accounts
#Update-AzConfig -DisplayBreakingChangeWarning $false
<#
 To validate Azure subscription access        : Verify_Azure_Access
 Using DMS:
 To upload backup files to Azure Blob storage : Invoke_DMS_UploadToBlobStorage
 To start migration to the Managed Instance   : Invoke_DMSMigrationFromBlobtoMI
 To get the migration status                  : Invoke_DMSMigrationStatus
 To perform the migration cutover             : Invoke_DMSMigrationCutover
#>
$ErrorActionPreference = 'Continue'

# Generate log file names from the current date/time
$dateTimeString = Get-Date -Format "yyyyMMddHHmmssffff"
$LogFile        = "$PSScriptRoot\Logs\MigrationLogs$dateTimeString.log"
$ErrorLogFile   = "$PSScriptRoot\Logs\ErrorLogs$dateTimeString.log"
$date           = Get-Date -Format "yyyy-MM-dd HH:mm:ss"

function Verify_Azure_Access {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$false, Position=0, HelpMessage="Please specify the Subscription Id")]
        [string]$FirstID
    )
    if ($FirstID -eq "") {
        # Path to the CSV containing the folder backup paths
        $ServerInfo = Import-Csv -Path "$PSScriptRoot\Config\FilestoUpload.csv" | ? isActive -eq "1"
        $SubID = $ServerInfo.SubscriptionId
        if ($SubID.Count -gt 1) {
            $FirstID = $SubID.Item(0)   # first subscription id in the list
        } else {
            $FirstID = $SubID
        }
    }
    $successfulLogin = Connect-AzAccount -Subscription $FirstID
    if ($successfulLogin) {
        "Successfully logged in to the subscription."
    } else {
        return
    }
}

# Function to upload files from the file share to blob storage
function Invoke_DMS_UploadToBlobStorage {
    $ServerInfo = Import-Csv -Path "$PSScriptRoot\Config\FilestoUpload.csv" | ? isActive -eq "1"
    $SubID = $ServerInfo.SubscriptionId
    if ($SubID.Count -gt 1) { $FirstID = $SubID.Item(0) } else { $FirstID = $SubID }
    Verify_Azure_Access($FirstID)

    # Loop through the configured databases, create containers in Azure Storage,
    # and upload the backup files using AzCopy
    foreach ($file in $ServerInfo) {
        try {
            $ServerName     = $file.ServerName.Replace("\","-").ToLower()
            $DatabaseName   = $file.DatabaseName.ToLower()
            $SContainerName = $file.ContainerName
            $BFolderName    = $file.FolderName
            $localPath      = $file.BackupPath.ToLower()   # path containing the backup files

            if ($SContainerName -eq "") {
                $ContainerFolderName = $ServerName + "-" + $DatabaseName
                $BFolderName = $DatabaseName.ToLower()
            } else {
                $ContainerFolderName = $SContainerName.ToLower()
            }
            if ($BFolderName -eq "") { $BFolderName = $DatabaseName.ToLower() }

            $storageAccountRG   = $file.ResourceGroupName
            $storageAccountName = $file.StorageAccountName

            # Get the storage account key and build the storage contexts
            $storageAccountKey  = (Get-AzStorageAccountKey -ResourceGroupName $storageAccountRG -AccountName $storageAccountName).Value[0]
            $destinationContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
            $ctx                = New-AzStorageContext -StorageAccountName $storageAccountName -UseConnectedAccount

            # Check whether the container exists; create it if it does not
            $getcontainer = Get-AzStorageContainer -Context $ctx -Name $ContainerFolderName -ErrorAction SilentlyContinue
            if ($getcontainer.Name -ne $ContainerFolderName) {
                $CreatingContainer = New-AzStorageContainer -Name $ContainerFolderName -Context $destinationContext
            } else {
                Write-Host "Container" $ContainerFolderName "already exists in the Azure storage account" $storageAccountName
            }

            # Generate a SAS URI for the container (valid for one hour)
            $containerSASURI = New-AzStorageContainerSASToken -Context $destinationContext -ExpiryTime (Get-Date).AddSeconds(3600) -FullUri -Name $ContainerFolderName -Permission rw
            Write-Host ""

            # Get the files already present in the container
            $ContainerFileNames = Get-AzStorageBlob -Container $ContainerFolderName -Context $destinationContext | Select-Object Name

            # Get the local folder matching the configured backup folder name
            $FolderSList = (Get-ChildItem $localPath -Directory) | Where-Object { $_.Name -eq $BFolderName }

            foreach ($DirectoryName in $FolderSList) {
                $DName = $DirectoryName.Name.ToLower()
                # Compare with the configured folder name before uploading
                if ($DName -eq $BFolderName) {
                    if ($BFolderName -eq "") { $BackupPath = $localPath + '\' + $DName }
                    else                     { $BackupPath = $localPath + '\' + $BFolderName }
                    $BackupFiles = Get-ChildItem $BackupPath

                    # Loop through the backup files and upload each one using AzCopy
                    foreach ($BackupFile in $BackupFiles) {
                        $BkFileName = $BackupFile.Name.ToString()
                        try {
                            $carray = $ContainerFileNames | Where-Object { $_.Name -eq $BkFileName }
                            if ($carray) {
                                # File already exists in the container; skip it
                            } else {
                                $FilePath     = $BackupPath + "\" + $BkFileName
                                $app          = "$PSScriptRoot\azcopy.exe"
                                $ArgumentList = " copy """ + $FilePath + """ """ + $containerSASURI + """ --overwrite false"
                                Write-Host "Logging Time:$date AzCopy starting for file $BkFileName"
                                Add-Content -Path $LogFile "Logging Time:$date AzCopy starting for file $BkFileName"
                                Start-Process -FilePath $app -ArgumentList $ArgumentList -Wait
                                Write-Host "Logging Time:$date AzCopy finished uploading file $BkFileName"
                                Add-Content -Path $LogFile "Logging Time:$date AzCopy finished uploading file $BkFileName"
                            }
                        } catch {
                            $errorMessage = $_.Exception.Message
                            $errorRecord  = $_.Exception.Record
                            $logMessage   = "[$date] Error: $errorMessage`n$errorRecord filename: $BkFileName"
                            Add-Content -Path $ErrorLogFile -Value $logMessage
                        }
                    }
                }
            }
        } catch {
            Write-Host $Error
            $errorMessage = $_.Exception.Message
            $errorRecord  = $_.Exception.Record
            $logMessage   = "[$date] Error: $errorMessage`n$errorRecord database: $($file.DatabaseName)"
            Add-Content -Path $ErrorLogFile -Value $logMessage
        }
    }
}

# Function to initiate the migration from blob storage to the Managed Instance
function Invoke_DMSMigrationFromBlobtoMI {
    $MigrationConfig = Import-Csv -Path "$PSScriptRoot/Config/Migration-Config.csv" | ? isActive -eq "1"
    $SubID = $MigrationConfig.SubscriptionId
    if ($SubID.Count -gt 1) { $FirstID = $SubID.Item(0) } else { $FirstID = $SubID }
    Verify_Azure_Access($FirstID)

    foreach ($DB in $MigrationConfig) {
        $SubscriptionId       = $DB.SubscriptionId
        $ResourceGroupName    = $DB.ResourceGroupName.ToLower()
        $StorageAccountName   = $DB.StorageAccountName.ToLower()
        $MigrationServiceName = $DB.MigrationServiceName.ToLower()
        $MIName               = $DB.ManagedInstanceName.ToLower()
        $DatabaseName         = $DB.DatabaseName.ToLower()
        $ContainerName        = $DB.ContainerName.ToLower()
        $StorageAccountKey    = $DB.StorageAccountKey

        # Get the storage account key if one was not supplied in the config file
        if ($StorageAccountKey -eq "") {
            $StorageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $ResourceGroupName -AccountName $StorageAccountName).Value[0]
        }

        $StorageAccountID     = "/subscriptions/$SubscriptionId/resourceGroups/$ResourceGroupName/providers/Microsoft.Storage/storageAccounts/$StorageAccountName"
        $MigrationServiceName = "/subscriptions/$SubscriptionId/resourceGroups/$ResourceGroupName/providers/Microsoft.DataMigration/SqlMigrationServices/$MigrationServiceName"
        $ManagedInstance      = "/subscriptions/$SubscriptionId/resourceGroups/$ResourceGroupName/providers/Microsoft.Sql/managedInstances/$MIName"

        if ($ContainerName -eq "") {
            if ($DB.SourceSqlConnectionDataSource -eq "") {
                $ContainerName = $DatabaseName.ToLower()
            } else {
                $ContainerName = $DB.SourceSqlConnectionDataSource.Replace("\","-").ToLower() + "-" + $DatabaseName
            }
        }

        # Build a hashtable with the migration parameters for this database
        $NewDatabaseMigrationParameters = @{}
        $NewDatabaseMigrationParameters.ResourceGroupName                 = $ResourceGroupName
        $NewDatabaseMigrationParameters.ManagedInstanceName               = $MIName
        $NewDatabaseMigrationParameters.TargetDbName                      = $DatabaseName
        $NewDatabaseMigrationParameters.Kind                              = "SqlMI"
        $NewDatabaseMigrationParameters.Scope                             = $ManagedInstance
        $NewDatabaseMigrationParameters.MigrationService                  = $MigrationServiceName
        $NewDatabaseMigrationParameters.AzureBlobStorageAccountResourceId = $StorageAccountID
        $NewDatabaseMigrationParameters.AzureBlobAccountKey               = $StorageAccountKey
        $NewDatabaseMigrationParameters.AzureBlobContainerName            = $ContainerName

        $DbName = $DatabaseName   # for cutover
        Write-Host "Logging Time:$date Starting Migration for Database:$DbName"
        Add-Content -Path $LogFile "Logging Time:$date Starting Migration for Database:$DbName"
        try {
            # Start the migration for this database without waiting for completion
            New-AzDataMigrationToSqlManagedInstance @NewDatabaseMigrationParameters -ErrorAction Continue -NoWait
        } catch {
            Write-Error "$_" -ErrorAction Continue
            Write-Host $Error
            $errorMessage = $_.Exception.Message
            $errorRecord  = $_.Exception.Record
            $date         = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
            $logMessage   = "[$date] Error: $errorMessage`n$errorRecord databasename: $DbName"
            Add-Content -Path $ErrorLogFile -Value $logMessage
        }
    }
}

# Function to get the migration status
function Invoke_DMSMigrationStatus {
    $MigrationConfig = Import-Csv -Path "$PSScriptRoot/Config/Migration-Config.csv" | ? isActive -eq "1"
    foreach ($Server in $MigrationConfig) {
        try {
            $MigrationDetails = Get-AzDataMigrationToSqlManagedInstance -ManagedInstanceName $Server.ManagedInstanceName -ResourceGroupName $Server.ResourceGroupName -TargetDbName $Server.DatabaseName -Expand MigrationStatusDetails -WarningAction SilentlyContinue
            $ExpanDetails             = $MigrationDetails.MigrationStatusDetail
            $MgState                  = $ExpanDetails.MigrationState
            $BlobContainerName        = $ExpanDetails.BlobContainerName
            $CurrentRestoringFilename = $ExpanDetails.CurrentRestoringFilename
            $LastRestoredFilename     = $ExpanDetails.LastRestoredFilename
            $ProvisioningState        = $MigrationDetails.ProvisioningState
            $MigrationStatus          = $MigrationDetails.MigrationStatus
            $ManagedInstanceName      = $Server.ManagedInstanceName
            $DatabaseName             = $Server.DatabaseName
            Write-Host "
            Database Restore Status  : $ManagedInstanceName - $DatabaseName
            MonitorMigration         : $MgState
            ProvisioningState        : $ProvisioningState
            BlobContainerName        : $BlobContainerName
            LastRestoredFilename     : $LastRestoredFilename
            CurrentRestoringFilename : $CurrentRestoringFilename
            Restore Status is        : $MigrationStatus
            "
        } catch {
            Write-Error "$_" -ErrorAction Continue
        }
    }
}

# Function to perform the migration cutover
function Invoke_DMSMigrationCutover {
    $MigrationConfig = Import-Csv -Path "$PSScriptRoot/Config/Migration-Config.csv" | ? AutomaticCutover -eq "1" | ? isActive -eq "1"
    $MigrationConfigCount = ($MigrationConfig | Measure-Object).Count
    if ($MigrationConfigCount -gt 0) {
        foreach ($Server in $MigrationConfig) {
            $MigrationDetails = Get-AzDataMigrationToSqlManagedInstance -ManagedInstanceName $Server.ManagedInstanceName -ResourceGroupName $Server.ResourceGroupName -TargetDbName $Server.DatabaseName -Expand MigrationStatusDetails -WarningAction SilentlyContinue
            $ExpanDetails        = $MigrationDetails.MigrationStatusDetail
            $ProvisioningState   = $MigrationDetails.ProvisioningState
            $MigrationStatus     = $MigrationDetails.MigrationStatus
            $ManagedInstanceName = $Server.ManagedInstanceName
            $DatabaseName        = $Server.DatabaseName
            # Cut over only when DMS is ready and the last backup file has been restored
            if ($MigrationStatus -eq "ReadyForCutover" -and $ExpanDetails.CurrentRestoringFilename -eq $ExpanDetails.LastRestoredFilename) {
                Write-Host "Logging Time:$date Starting Cutover for Database: $ManagedInstanceName - $DatabaseName"
                Add-Content -Path $LogFile "Logging Time:$date Starting Cutover for Database: $ManagedInstanceName - $DatabaseName"
                Invoke-AzDataMigrationCutoverToSqlManagedInstance -ManagedInstanceName $ManagedInstanceName -ResourceGroupName $Server.ResourceGroupName -TargetDbName $DatabaseName -MigrationOperationId $MigrationDetails.MigrationOperationId -WarningAction SilentlyContinue
                Write-Host "Logging Time:$date Ending Cutover for Database: $ManagedInstanceName - $DatabaseName"
                Add-Content -Path $LogFile "Logging Time:$date Ending Cutover for Database: $ManagedInstanceName - $DatabaseName"
            } else {
                Write-Host "Logging Time:$date Database ( $ManagedInstanceName - $DatabaseName ) is not ready for cutover, or is already restored and online"
            }
        }
    } else {
        Write-Host "No databases are selected to perform cutover. Please set the AutomaticCutover column value to 1 in the config file."
    }
}
```



Feedback and suggestions


If you have feedback or suggestions for improving this data migration asset, please contact the Data SQL Ninja Engineering Team (datasqlninja@microsoft.com). Thanks for your support!

Note: For additional information about migrating various source databases to Azure, see the Azure Database Migration Guide.
