Migrating TDE Databases to Azure SQL Database Managed Instance – Part 1

Imagine the scenario: we have been asked to move our current on-premises database to Azure SQL Database Managed Instance. How should we go about it?


Before we begin, I just want to make sure that we are starting from the same place. This post works on the following assumptions:

  1. That the source version of SQL Server is 2012 or higher.
  2. That Transparent Data Encryption (TDE) is already in place.
  3. That we are not using the Key Vault EKM provider to manage TDE encryption keys.
  4. That the Azure SQL Database Managed Instance has been provisioned and is ready for us to work with.


All of the scripts used in this post can be downloaded from GitHub here.

Migrating the Database

When moving to Managed Instance there are two main routes that can be taken: either manually back up and restore the database via Azure Blob Storage, or use the Azure Database Migration Service (DMS). The choice between them will depend on the amount of downtime that you can accept for the migration cut-over.

Both of these options can move a database with TDE enabled. However, we need to ensure that the master key and certificates are in place on the target so that we can restore the database.

Migrating TDE Certificate

Before we can migrate the certificate we need to identify which one on the server is applicable to the database we want to migrate. If we do not migrate this certificate then the Managed Instance will not be able to restore the database from the backup file, and we will get an error similar to the following:

Msg 33111, Level 16, State 3
Cannot find server certificate with thumbprint '0x…'.

We can use the following T-SQL statement to find the appropriate certificate in the master database which we need to move.

USE [master];

DECLARE @DatabaseName sysname = 'Adventureworks2019_TDE';

SELECT dek.database_id,
       DB_NAME(dek.database_id) AS DbName,
       c.name AS CertificateName,
       c.thumbprint
FROM sys.dm_database_encryption_keys AS dek
    JOIN sys.certificates AS c
        ON dek.encryptor_thumbprint = c.thumbprint
WHERE dek.database_id = DB_ID(@DatabaseName);

The result set can be seen below, where we can see the certificate being used for the specific database we want to move. You will also notice that the thumbprint in the results is the same as the one in the error message earlier.

Grid output highlighting Database and Certificate names for database encryption.

Now that we have identified the certificate that we need to migrate we can move it from our SQL Server to the Managed Instance. There are three stages we need to step through in order to do this successfully:

  1. Back up the certificate with its private key from the source server.
  2. Convert the *.cer and *.pvk files into a *.pfx file.
  3. Load the *.pfx file into Managed Instance.

Step 1 – Backup Certificate & Key to File

It is possible to use the BACKUP CERTIFICATE command in T-SQL to export the certificate and its private key to files. However, we will use DBATools for this as it means we can use one PowerShell script to do all of the work for us. The script below needs us to set a couple of variables and will then output two files to the location we specify.

$sourceSqlServer = 'SQL-01'
$sourceDatabase = "master"
$CertName = "TdeServerCert01"
$certBackupPath = "C:\SQLServer\Backup"
$certSecurePassword = Read-Host -Prompt "Please provide the password for the certificate." -AsSecureString
$certBackupParams = @{
    SqlInstance = $sourceSqlServer
    Database = $sourceDatabase
    Certificate = $CertName
    EncryptionPassword = $certSecurePassword
    Path = $certBackupPath
}
$certBackup = Backup-DbaDbCertificate @certBackupParams

This will export the certificate as a *.cer file and the private key as a *.pvk file, which is protected with the password supplied. Here we are also creating a variable which is loaded with the object returned by the backup cmdlet so that we can reference its properties later in the script.
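If you are curious what the native T-SQL route mentioned above looks like, it can be driven from the same PowerShell session with dbatools' Invoke-DbaQuery. This is a sketch only: it assumes the certificate name, backup path and instance variables from the script above, and the password is shown as a placeholder.

```powershell
# Sketch of the T-SQL alternative: BACKUP CERTIFICATE run via Invoke-DbaQuery.
# Reuses $sourceSqlServer from the script above; the password is a placeholder.
$backupCertSql = @"
BACKUP CERTIFICATE TdeServerCert01
TO FILE = 'C:\SQLServer\Backup\TdeServerCert01.cer'
WITH PRIVATE KEY (
    FILE = 'C:\SQLServer\Backup\TdeServerCert01.pvk',
    ENCRYPTION BY PASSWORD = '<StrongPasswordHere>'
);
"@
Invoke-DbaQuery -SqlInstance $sourceSqlServer -Database master -Query $backupCertSql
```

Either route produces the same pair of files; the DBATools cmdlet simply saves us from hard-coding the file paths in the T-SQL.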

Step 2 – Convert to PFX File

Managed Instance will not let us use the standard restore certificate process which we would use when moving between regular SQL Server instances. Instead we need to use the Add-AzSqlManagedInstanceTransparentDataEncryptionCertificate cmdlet, and this cannot take the *.cer and *.pvk files as input.

In order to perform this conversion we need to make use of the development tool pvk2pfx.exe. This is not installed by default and comes as part of the Windows Driver Kit or Windows SDK. We can download the Windows SDK for Windows 10 here. Once downloaded then we only need to install the C++ components and this will place the tool we need into C:\Program Files (x86)\Windows Kits\10\bin\10.0.19041.0\x64.

Once this has been installed we can execute the next part of our script which will convert the output from the backup in Step 1 into the PFX file we need.

$pvk2pfxLocation = 'C:\Program Files (x86)\Windows Kits\10\bin\10.0.19041.0\x64'
$pfxOutput = "$certBackupPath\$CertName.pfx"
$convertCertParams = @{
    ScriptBlock = {
        Set-Location $pvk2pfxLocation
        .\pvk2pfx.exe -pvk $args[0] -pi $args[1] -spc $args[2] -pfx $args[3]
    }
    ArgumentList = $certBackup.Key, (Read-Host -Prompt "Please provide the password for the certificate."), $certBackup.Path, $pfxOutput
}
Invoke-Command @convertCertParams

During this process we are prompting for the password, which is preferable to writing it into the script or storing it in a variable. Unfortunately, because the pvk2pfx tool is a standalone executable, we cannot use the PowerShell secure string type for the password here.
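If you would rather keep the password in a secure string right up until the moment pvk2pfx needs it, one option is to unwrap it just-in-time with the .NET Marshal class. This is a sketch reusing the $certSecurePassword variable from Step 1; it trades the Read-Host prompt for a brief plain-text copy in memory, which is zeroed immediately afterwards.

```powershell
# Unwrap the SecureString from Step 1 only at the point of use, then zero
# the unmanaged copy so the plain text does not linger in memory.
$bstr = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($certSecurePassword)
try {
    $plainPassword = [System.Runtime.InteropServices.Marshal]::PtrToStringBSTR($bstr)
}
finally {
    [System.Runtime.InteropServices.Marshal]::ZeroFreeBSTR($bstr)
}
# $plainPassword can now be passed as the -pi argument to pvk2pfx.exe.
```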

Once this process has been completed then we will have a *.pfx file in our backup directory at which point we can move to the last step of this section, adding the certificate to our Managed Instance.

Step 3 – Uploading the PFX file to Managed Instance

Because Managed Instance is a PaaS offering in Azure it means there is no file system for us to work with to restore the certificate from. As such there is a dedicated PowerShell cmdlet which we can use to upload the data directly to Managed Instance.

The Add-AzSqlManagedInstanceTransparentDataEncryptionCertificate cmdlet (thank goodness for IntelliSense and autocomplete) requires that we pass in the *.pfx file as a blob, along with the password and the Managed Instance we want it added to. Building on the scripts we have already run to generate the file we need, the following will convert it and add it to our Managed Instance.

$pfxBinData = [System.Convert]::ToBase64String([IO.File]::ReadAllBytes($pfxOutput)) | ConvertTo-SecureString -AsPlainText -Force
$managedInstanceName = "mi-migrations"
$managedInstanceTdeCertparams = @{
    ResourceGroupName = ((Get-AzSqlInstance -Name $managedInstanceName).ResourceGroupName)
    ManagedInstanceName = $managedInstanceName
    PrivateBlob = $pfxBinData
    Password = $certSecurePassword
}
Add-AzSqlManagedInstanceTransparentDataEncryptionCertificate @managedInstanceTdeCertparams

Now that is complete we are in a position to migrate our database from our on-premises SQL Server system to Managed Instance.

Migrating the Database to Managed Instance

As mentioned earlier, we have two options for moving a database to Managed Instance. Manually doing a backup and restore via Azure Blob Storage, or DMS. We are going to do an ‘offline’ migration to Managed Instance via backup & restore here. If we needed to perform an ‘online’ migration where downtime is minimised then we would use DMS.

There are two steps to this process: the first is to set up the Azure Storage account for us to use as a backup point; the second is to back up and restore the database.

Step 1 – Configure Azure Storage & Credentials

Before we can perform the database migration we need to configure Azure Blob Storage to act as a staging area for our backup file. Ideally, we should make sure that the storage account we create is in the same Azure region as the Managed Instance that we will be restoring the database to. We can create the storage account with PowerShell using the following script.

# Create a storage account for us to back up to.
$newStorageAccountParams = @{
    ResourceGroupName     = "rg-migrations"
    Name                  = "stmigblogpost"
    SkuName               = "Standard_LRS"
    Location              = 'UK South'
    Kind                  = "StorageV2"
    AccessTier            = "Hot"
    MinimumTlsVersion     = "TLS1_2"
    AllowBlobPublicAccess = $false
}
$storageAccount = New-AzStorageAccount @newStorageAccountParams

# Create a container in the storage account for the backup files.
$newStorageAccountContainerParams = @{
    Name    = "dbbackups"
    Context = $storageAccount.Context
}
$container = New-AzStorageContainer @newStorageAccountContainerParams

This has created the account and a container for us to place the backups into. Once this is in place we need a SAS token which will be used to grant the source and destination systems access to this container via credential objects. The following PowerShell will create the SAS token and store it in a variable for us.

$newSasTokenParams = @{
    Context      = $storageAccount.Context
    Service      = "Blob"
    ResourceType = "Container","Object"
    Permission   = "racwdlup"
    StartTime    = ((Get-Date).AddDays(-1))
    ExpiryTime   = ((Get-Date).AddDays(7))
}
$sasToken = New-AzStorageAccountSASToken @newSasTokenParams

Here we are creating a token which will be valid as soon as we create it and for seven days after. This time span will vary depending on how long we expect the migration activities to take.

Now that we have all of the Azure storage elements in place and a SAS token for access, we can create credentials on the SQL Server and Managed Instance. These credentials will allow the database engine to back up to URL and restore from URL. DBATools has a couple of great commands which make this process a lot easier; they are in the script below.

$newDbaCredentialParams = @{
    SqlInstance    = "SQL-01"
    Name           = $container.CloudBlobContainer.Uri.AbsoluteUri
    Identity       = "Shared Access Signature"
    SecurePassword = (ConvertTo-SecureString -AsPlainText -Force ($sasToken.Replace("?","")))
}
$credential = New-DbaCredential @newDbaCredentialParams

$copyCredentialParams = @{
    Source                   = "SQL-01"
    Destination              = "mi-migrations.public.000000000000.database.windows.net,3342"
    DestinationSqlCredential = (Get-Credential)
    Name                     = $credential.Name
}
Copy-DbaCredential @copyCredentialParams

Note: when the SAS token is returned it will have a leading "?" which needs to be removed for the credential objects to work properly.

First of all we create the credential on the source system, then copy it to the destination. This way we know that the credential objects are the same and access is valid on both systems. Additionally, we are using the public endpoint for the Managed Instance here. If we had hybrid network connectivity set up for our workstation we would be able to use the private endpoint in this command.

Now we are all set for doing the backup and restore.

Step 2 – Migrate the Database

Again we will work with DBATools here, allowing us to build on the scripts we used to set up the storage and credentials. First we will take the backup with Backup-DbaDatabase.

$backupDatabaseParams = @{
    SqlInstance  = "sql-01"
    Database     = "Adventureworks2019_TDE"
    AzureBaseUrl = $container.CloudBlobContainer.Uri.AbsoluteUri
    Type         = "Full"
}
$backup = Backup-DbaDatabase @backupDatabaseParams

This will create the backup file in the storage container we created, via the backup to URL process. We are also storing the backup object in a variable so that we can easily obtain the properties needed for the restore.
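Before moving on it can be worth confirming that the backup file actually landed in the container. A quick check with Get-AzStorageBlob does the job; this sketch reuses the $storageAccount variable and the "dbbackups" container name from the storage provisioning script earlier.

```powershell
# List the blobs in the backup container to confirm the backup file arrived.
# Reuses $storageAccount from the storage provisioning script above.
Get-AzStorageBlob -Container "dbbackups" -Context $storageAccount.Context |
    Select-Object Name, Length, LastModified
```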

Now that we have a backup in place we can restore it to our Managed Instance. This time we will use Restore-DbaDatabase from the DBATools module.

$restoreDatabaseParams = @{
    SqlInstance   = "mi-migrations.public.000000000000.database.windows.net,3342"
    SqlCredential = (Get-Credential)
    DatabaseName  = "Adventureworks2019_TDE"
    Path          = $backup.BackupPath
}
Restore-DbaDatabase @restoreDatabaseParams

Once this command completes we will have migrated the database to the new Managed Instance. All that remains now is to migrate the supporting objects such as jobs, logins, etc. and redirect the applications and the migration is complete. More information on how to go about migrating these objects with DBATools can be found here.
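As a taster of what migrating those supporting objects looks like, DBATools has dedicated Copy-* commands for logins and SQL Agent jobs. This is a sketch only, assuming the same source instance name and Managed Instance public endpoint used throughout this post.

```powershell
# Sketch: copy the logins and SQL Agent jobs that support the migrated
# database. Instance names match those used earlier in this post.
$miConnection = @{
    Destination              = "mi-migrations.public.000000000000.database.windows.net,3342"
    DestinationSqlCredential = (Get-Credential)
}
Copy-DbaLogin    -Source "SQL-01" @miConnection -ExcludeSystemLogins
Copy-DbaAgentJob -Source "SQL-01" @miConnection
```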

Finally, we just need to clean up the credentials, backup files and storage account, and then decommission the old on-premises SQL Server.


In this post we have looked at how to successfully migrate a SQL Server database protected with Transparent Data Encryption (TDE) to Azure SQL Database Managed Instance. We looked at how to identify the prerequisites and leap over the hurdles of dealing with a new form of SQL Server.
