
Backing up Azure Cosmos DB


If you have started using Azure Cosmos DB, you may have found that its backup options are fairly limited by default. Having recently deleted a test database by mistake and then spent three days navigating the support maze to have it restored, I took the chance to implement a more robust backup strategy.

The simplest way I have found to do this is to back up the entire database using Azure Data Factory. Depending on your database size, you may prefer to back up the change feed on a more regular basis instead, but this approach worked well for me in the scenario I encountered.

Steps for Backing Up Azure Cosmos DB

To get started with Azure Data Factory:

  • Add it as a service through the Azure Portal
  • Click on Author and Monitor to be taken to the homepage
  • Click on Copy Data

[Screenshot: AzurePortalDataFactory]

[Screenshot: AzureDFStartScreen]

  • Fill out the name for your task
  • Add a Cosmos DB Linked Service as a source
  • Fill out the connection details

[Screenshot: AzureDFCreateLinkedService]

[Screenshot: AzureDFCosmosConnection]

  • Select the Cosmos collection (Items in the case below)
  • Select “Export as-is to JSON files”

[Screenshot: AzureDFCosmosQuery]

  • Add a destination data store
  • Use a new Azure Blob Storage linked service

[Screenshot: AzureDFCreateLinkedServiceStorage]

  • Define the file path and name, using a temporary name for now

[Screenshot: AzureDFSelectFolder]

  • Click Next through the remaining screens until the wizard completes and creates the pipeline
  • Click the pencil icon and navigate through the Pipelines to find your newly created Copy Data pipeline
  • Click the Copy Data activity and navigate to the Sink section
  • Edit the Sink Dataset
  • Find the file path, select backup.json and select Add Dynamic Content

[Screenshot: AzureDFUpdateFolder]

  • Change the value to "@concat(utcnow(),'.json')" so that each run gives the file a unique name
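As an illustration (this is Python, not Data Factory itself), the expression behaves roughly like the sketch below: utcnow() returns an ISO 8601 UTC timestamp, and concat appends the extension, so every run produces a distinct blob name.

```python
from datetime import datetime, timezone

def backup_blob_name() -> str:
    """Rough Python equivalent of the Data Factory expression
    @concat(utcnow(),'.json'): an ISO 8601 UTC timestamp plus the
    .json extension, unique for every pipeline run."""
    return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ") + ".json"

print(backup_blob_name())  # e.g. 2024-05-01T09:15:30.123456Z.json
```

Note that Data Factory's utcnow() formats the fractional seconds slightly differently, but the effect is the same: timestamped, non-colliding file names.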

That completes the setup: you can now click Debug to test the pipeline, then Publish when you are ready. Depending on your preferences, create a Trigger to run it daily or at whatever frequency you need. To restore a backup, create a similar Copy Data task going the other way.
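If you later want the same copy without the Data Factory wizard, the backup can also be scripted. Below is a minimal sketch using the azure-cosmos and azure-storage-blob Python SDKs; the account URL, keys and container names are placeholders, and this is an alternative illustration rather than what the pipeline above does internally.

```python
import json
from datetime import datetime, timezone


def serialize_items(items) -> str:
    """Serialize Cosmos documents to newline-delimited JSON, mirroring
    the 'Export as-is to JSON files' option in the Copy Data wizard."""
    return "\n".join(json.dumps(item) for item in items)


def backup_collection(cosmos_url: str, cosmos_key: str,
                      database: str, collection: str,
                      storage_conn_str: str, blob_container: str) -> str:
    """Read every document from the collection and upload a single
    timestamped JSON blob. All connection values are placeholders."""
    from azure.cosmos import CosmosClient
    from azure.storage.blob import BlobServiceClient

    container = (CosmosClient(cosmos_url, credential=cosmos_key)
                 .get_database_client(database)
                 .get_container_client(collection))
    data = serialize_items(container.read_all_items())

    # Timestamped name, same idea as @concat(utcnow(),'.json')
    name = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H-%M-%SZ") + ".json"
    blob_service = BlobServiceClient.from_connection_string(storage_conn_str)
    blob_service.get_blob_client(blob_container, name).upload_blob(data)
    return name


# Example call (hypothetical values - substitute your own):
# backup_collection("https://myaccount.documents.azure.com:443/", "<key>",
#                   "MyDatabase", "Items", "<storage-connection-string>", "backups")
```

A restore would simply read the blob back and upsert each JSON line into the target collection, mirroring the reversed Copy Data task mentioned above.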

Backing up Azure Cosmos DB in this way provides some reassurance that, should you accidentally delete something in the portal, it can easily be recovered.

By Kevin McDonnell, Senior Technical Architect at Ballard Chalmers


About the author

Kevin McDonnell is a respected Senior Technical Architect at Ballard Chalmers. With a Master of Engineering (MEng) in Engineering Science from the University of Oxford, he specialises in .NET & Azure development and has a broad understanding of the wider Microsoft stack. He listens to what clients are looking to achieve and helps identify the best platform and solution to deliver on that. Kevin regularly blogs on Digital Workplace topics and is a regular contributor to the monthly #CollabTalk discussions on Twitter.

Post Terms: Azure | Azure Cosmos DB

