
SQL Server Backup to URL

There are strong synergies between SQL Server and Azure storage, and Microsoft has given us several techniques for backing up a local SQL Server database to Azure storage. Two main approaches stand out: file-snapshot backups for database files stored in Azure blob storage, and SQL Server backup to URL. We will be discussing the latter of the two.

Benefits of SQL Server backup to URL

Why would you want to use Azure storage as the destination for your SQL Server backup?

      • No hardware overhead: Azure manages the storage hardware for you, so there is no hardware management overhead.
      • High storage capacity: By backing up directly to Azure storage you will be able to access up to 500 TB of space per storage account.
      • Backup availability: Backups stored in Azure blobs are available from anywhere, at any time.
      • Storage replication benefits: Assuming you have set up Geo-Redundant Storage (GRS) for your storage account, it is designed to provide at least 99.99999999999999% (16 9’s) durability of objects over a given year by replicating your data to a secondary region hundreds of miles away from the primary region.
      • Cost: Azure storage can be a cost-effective off-site backup archive, which is exactly the role it plays for my SQL Server backups.


To successfully set up a SQL Server backup to URL you will need the following, which we will cover in the upcoming sections.

      • A Microsoft Azure account.
      • Storage Account within Azure.
      • Container for your backups.
      • SQL Server credential.

Set up

Log in to your Microsoft Azure account via the Azure portal and navigate to Storage accounts.

You will be presented with the main overview page, where you will need to click the Blobs option.

From here, create a container to hold your database backups. Microsoft best practice is to set the access level for this container to private.


There are a couple of important pieces of information needed now: the storage account name, the blob service endpoint and the storage access key (key 1 or key 2), all of which can be found within the portal as highlighted by the black boxes below.
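For reference, the blob service endpoint and the backup destination URL follow a predictable pattern (the storage account, container and file names below are placeholders, not values from this article):

```
https://<storage account>.blob.core.windows.net
https://<storage account>.blob.core.windows.net/<container>/<backup file>.bak
```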

The code

Now we have all the information necessary to write the T-SQL to back up a local SQL Server database to Azure Storage using the storage account identity and access key technique.

The first step is to create a SQL Server credential where the IDENTITY is the name of the storage account and the SECRET is either key 1 or key 2 (obfuscated for this article).

CREATE CREDENTIAL Backupcredential
WITH IDENTITY = 'cloudspacestorage',
-- SECRET is key 1 or key 2 from the portal (obfuscated here)
SECRET = '<storage access key>';

Within SSMS (SQL Server Management Studio) you can confirm the creation of the credential.
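For example, a quick query against sys.credentials (a sketch, using the credential name created above) will show the new entry:

```
SELECT name, credential_identity
FROM sys.credentials
WHERE name = 'Backupcredential';
```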

Finally, here is the code for the actual backup, where we state the endpoint via “TO URL” whilst using the credential from the step above.

-- the container name is a placeholder
BACKUP DATABASE [TestDB]
TO URL = 'https://cloudspacestorage.blob.core.windows.net/<container>/TestDB.bak'
WITH CREDENTIAL = 'Backupcredential', STATS = 10;

Processed 312 pages for database 'TestDB', file 'TestDB' on file 1.
100 percent processed.
Processed 3 pages for database 'TestDB', file 'TestDB_log' on file 1.
BACKUP DATABASE successfully processed 315 pages in 0.691 seconds (3.561 MB/sec).


You can confirm that the backup has been created by connecting to the storage account which I have done via SSMS below.
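Alternatively, you can read the backup header directly from the URL without leaving T-SQL (a sketch reusing the credential and placeholder URL from above):

```
RESTORE HEADERONLY
FROM URL = 'https://cloudspacestorage.blob.core.windows.net/<container>/TestDB.bak'
WITH CREDENTIAL = 'Backupcredential';
```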


Restoring from a backup file located in Azure Storage is very similar to a normal restore statement. The key differences are the source location and the use of the credential.

USE [master];
RESTORE DATABASE [TestDBCopy]
-- the container name is a placeholder
FROM URL = 'https://cloudspacestorage.blob.core.windows.net/<container>/TestDB.bak'
WITH CREDENTIAL = 'Backupcredential',
     -- MOVE is needed when restoring as a copy alongside the original;
     -- the file paths below are placeholders
     MOVE 'TestDB' TO '<data path>\TestDBCopy.mdf',
     MOVE 'TestDB_log' TO '<data path>\TestDBCopy_log.ldf',
     STATS = 10;

Processed 312 pages for database 'TestDBCopy', file 'TestDB' on file 1.
Processed 3 pages for database 'TestDBCopy', file 'TestDB_log' on file 1.
RESTORE DATABASE successfully processed 315 pages in 0.136 seconds (18.095 MB/sec).

What about SAS?

Technically you could go a different route and use a Shared Access Signature (SAS), which you would then use to create a SQL Server credential on a newly created container. The advantage of this strategy is that you can grant clients access to resources in your storage account without sharing your account keys. What also stands out with the SAS method is its granularity: you can set start/expiry times, assign different permissions (read, write and delete) and even restrict the IP addresses from which Azure storage will accept the SAS.

At a high level, it is quite similar to the access key technique mentioned in this article, the key difference being the requirement to use Microsoft Azure PowerShell to create a SAS that is associated with a stored access policy.

Once you have done the above (please see the references section below), you would then execute the following:

-- When using a SAS, the credential name must match the container URL;
-- the storage account and container names are placeholders
IF NOT EXISTS
(SELECT * FROM sys.credentials
 WHERE name = 'https://<storage account>.blob.core.windows.net/<container>')
CREATE CREDENTIAL [https://<storage account>.blob.core.windows.net/<container>]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
  SECRET = 'fgfgfjDDFDSF445GDFDGDKO45054Mskd3a';

BACKUP DATABASE [TestDB]
TO URL = 'https://<storage account>.blob.core.windows.net/<container>/TestDB.bak';


When assessing a feature, it is also important to understand its limitations. Here is a list of some limitations when backing up to Azure storage:

  • The maximum backup size is 1TB.
  • You cannot stripe your backups across multiple files.
  • Maximum limit of 259 characters for a backup device name.
  • Cannot append to existing backup files.
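Given the 1TB cap and the inability to stripe across multiple files, backup compression is worth considering; it is supported with backup to URL. A sketch, reusing the credential and placeholder URL from earlier:

```
BACKUP DATABASE [TestDB]
TO URL = 'https://cloudspacestorage.blob.core.windows.net/<container>/TestDB.bak'
WITH CREDENTIAL = 'Backupcredential', COMPRESSION, STATS = 10;
```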

Code – references

For fully documented PowerShell snippets from Microsoft to set up a SAS, please review the following link:

Get in touch

Hopefully, after reading this article you now understand the benefits of backing up a local SQL Server database to Azure Storage, the two main techniques (SAS vs access keys) and the limitations involved. If you would like more information about the requirements, benefits and limitations of backing up a local SQL Server database to Azure, get in touch.

Looking for more help? We offer both software consultancy and development to help you with any Microsoft custom enterprise application that needs development or troubleshooting.


Post Terms: Azure | back up storage | Blob | blob storage | Database | Microsoft Cloud | Microsoft MVP | On-Premises

About the Author

Arun Sirpal, writing here as a freelance blogger, is a four-time former Data Platform MVP, specialising in Microsoft Azure and Database technology. A frequent writer, his articles have been published on SQL Server Central and Microsoft TechNet alongside his own personal website. During 2017/2018 he worked with the Microsoft SQL Server product team on testing the vNext adaptive query processing feature and other Azure product groups. Arun is a member of Microsoft’s Azure Advisors and SQL Advisors groups and frequently talks about Azure SQL Database.

Education, Membership & Awards

Arun graduated from Aston University in 2007 with a BSc (Hon) in Computer Science and Business. He went on to work as a SQL Analyst, SQL DBA and later as a Senior SQL DBA, DBA Team Lead and now Cloud Solution Architect. Alongside his professional progress, Arun became a member of the Professional Association for SQL Server. He became a Microsoft Most Valuable Professional (MVP) in November 2017 and has since won the award for the fourth time.

