
SQL Server Backup to URL

There is strong synergy between SQL Server and Azure storage, and Microsoft has given us several techniques for backing up a local SQL Server database to Azure storage. Two main approaches stand out: file-snapshot backups for database files in Azure blob storage, and SQL Server backup to URL. We will be discussing the latter of the two.

Benefits of SQL Server backup to URL

Why would you want to use Azure storage as the destination for your SQL Server backup?

      • No hardware overhead: There is no overhead of hardware management; Azure manages the storage hardware for you.
      • High storage capacity: By backing up directly to Azure storage you will be able to access 500TB of space per storage account.
      • Backup availability: Backups stored in Azure blobs are available from anywhere and at any time.
      • Storage replication benefits: Assuming you have set up Geo-Redundant Storage (GRS) for your storage account, it is designed to provide at least 99.99999999999999% (16 9’s) durability of objects over a given year by replicating your data to a secondary region that is hundreds of miles away from the primary region.
      • Cost: This can be cost-effective as an off-site backup and archive option, which is the case for my SQL Server backups.


To successfully set up a SQL Server backup to URL you will need to have the following, which we will cover in the upcoming sections.

      • A Microsoft Account.
      • Storage Account within Azure.
      • Container for your backups.
      • SQL Server credential.

Set up

Log in to your Microsoft Azure account via the Azure portal and navigate to Storage accounts.

You will be presented with the main overview page. You will need to click the blobs option.

From here you will need to create a container to hold your database backups. It is Microsoft best practice to set access to private for this container.


There are a couple of important things needed now. You will need the storage account name, the blob service endpoint and the storage access key (key 1 or key 2) which can all be found within the portal as highlighted via the black boxes below.

The code

Now we have all the information necessary to write the T-SQL that backs up a local SQL Server database to Azure storage using the storage account identity and access key technique.

The first step is to create a SQL Server credential, where the IDENTITY is the name of the storage account and the SECRET is either key 1 or key 2 (obfuscated for this article).

CREATE CREDENTIAL Backupcredential
WITH IDENTITY = 'cloudspacestorage',
SECRET = '<storage access key>'; -- key 1 or key 2, obfuscated for this article

Within SSMS (SQL Server Management Studio) you can confirm the creation of the credential.
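As a quick check from T-SQL itself, you can also query the credentials catalog view (a minimal sketch; the credential name matches the one created above):

```sql
-- Confirm the credential exists; the secret itself is never exposed
SELECT name, credential_identity, create_date
FROM sys.credentials
WHERE name = 'Backupcredential';
```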

Then, finally, the code for the actual backup, where we state the endpoint via “TO URL” whilst using the credential from the step above.

BACKUP DATABASE [TestDB]
TO URL = 'https://cloudspacestorage.blob.core.windows.net/<container>/TestDB.bak' -- container name is a placeholder
  WITH CREDENTIAL = 'Backupcredential', STATS = 10

Processed 312 pages for database 'TestDB', file 'TestDB' on file 1.
100 percent processed.
Processed 3 pages for database 'TestDB', file 'TestDB_log' on file 1.
BACKUP DATABASE successfully processed 315 pages in 0.691 seconds (3.561 MB/sec).


You can confirm that the backup has been created by connecting to the storage account which I have done via SSMS below.
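You can also verify the backup file directly from T-SQL by reading its header straight from blob storage (a sketch using the same credential as above; the container name is a placeholder):

```sql
-- Read the backup header from the blob without downloading it
RESTORE HEADERONLY
FROM URL = 'https://cloudspacestorage.blob.core.windows.net/<container>/TestDB.bak'
WITH CREDENTIAL = 'Backupcredential';
```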


Restoring from a backup file located in Azure storage is very similar to a normal restore statement. The key differences are the source location and the use of the credential.

USE [master]
RESTORE DATABASE [TestDBCopy]
FROM URL = 'https://cloudspacestorage.blob.core.windows.net/<container>/TestDB.bak' -- container name is a placeholder
  WITH CREDENTIAL = 'Backupcredential', STATS = 10

Processed 312 pages for database 'TestDBCopy', file 'TestDB' on file 1.
Processed 3 pages for database 'TestDBCopy', file 'TestDB_log' on file 1.
RESTORE DATABASE successfully processed 315 pages in 0.136 seconds (18.095 MB/sec).

What about SAS?

Technically you could go a different route and use a Shared Access Signature (SAS). You would then use this to create a SQL Server credential on a newly created container. The advantage of this strategy is that you can grant clients access to resources in your storage account without sharing your account keys. What also stands out with the SAS method is its granularity: setting start/expiry times, assigning different permissions (read, write and delete), and even restricting the IP addresses from which Azure storage will accept the SAS.

At a high level it is quite similar to the access key technique mentioned in this article, the key difference being the requirement to use Microsoft Azure PowerShell to create a SAS associated with a stored access policy.

Once you have done the above (please see the references section below), you would then execute the following:

IF NOT EXISTS
(SELECT * FROM sys.credentials
 WHERE name = 'https://cloudspacestorage.blob.core.windows.net/<container>') -- for SAS, the credential name is the container URL; container name is a placeholder
CREATE CREDENTIAL [https://cloudspacestorage.blob.core.windows.net/<container>]
  WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
  SECRET = 'fgfgfjDDFDSF445GDFDGDKO45054Mskd3a'; -- SAS token, obfuscated

BACKUP DATABASE [TestDB]
TO URL = 'https://cloudspacestorage.blob.core.windows.net/<container>/TestDB.bak';


When assessing a feature, it is also important to understand its limitations. Here are some limitations when backing up to Azure storage with the access key technique:

  • Maximum backup size is 1TB.
  • You cannot stripe your backups across multiple files.
  • Maximum limit of 259 characters for a backup device name.
  • You cannot append to existing backup files.
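Because you cannot append to an existing backup file, a common workaround is to generate a unique blob name for each backup run (a sketch under the assumptions used above; the container name is a placeholder):

```sql
-- Build a timestamped blob name so each backup gets its own file
DECLARE @url nvarchar(400) =
    N'https://cloudspacestorage.blob.core.windows.net/<container>/TestDB_'
    + REPLACE(CONVERT(nvarchar(19), GETDATE(), 126), ':', '-') + N'.bak';

BACKUP DATABASE [TestDB]
TO URL = @url
WITH CREDENTIAL = 'Backupcredential', FORMAT, STATS = 10;
```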

Code – references

For fully documented PowerShell snippets from Microsoft for setting up a SAS, please review Microsoft’s official documentation on shared access signatures.

Get in touch

Hopefully after reading this article you now understand the benefits of backing up your local SQL Server database to Azure storage, the two main techniques (SAS vs access keys), and their limitations. If you would like more information about the requirements, benefits and limitations of a local SQL Server backup to Azure, get in touch.

By Arun Sirpal (Microsoft MVP), Editorial Contributor



Copyright © 2018 Arun Sirpal: The right of Arun Sirpal to be identified as the author of this work has been asserted in accordance with the Copyright, Designs and Patents Act 1988. All rights reserved. No part of this publication may be reproduced, stored or introduced into a retrieval system, or transmitted, in any form, or by any means (electronic, mechanical, photocopying, recording or otherwise) without the prior written permission of the publisher, Ballard Chalmers. If consent is given, please give credit. Requests to re-print should be sent to [email protected].



Arun Sirpal is a Microsoft MVP, specialising in SQL Server architecture, administration, development, configuration and the optimisation of physical and virtual SQL Servers. His articles have been published on SQL Server Central and Microsoft TechNet. In 2015 Arun was mentored by the highly respected long-time SQL Server MVP, Paul Randal, and he also worked with the Microsoft SQL Server product team on testing the vNext adaptive query processing feature. Arun is a member of Microsoft’s Azure Advisors and SQL Advisors groups, as well as an accomplished Microsoft Certified Solutions Expert on the Data Platform.



